Using the size() method of a DataFrameGroupBy object, we can retrieve the number of elements in each group. The size() method returns a Series that contains the group names as the index and the corresponding group sizes as the values.
Example
import pandas as pd

data = {'Name': ['Krishna', 'Chamu', 'Joel', 'Gopi', 'Sravya', 'Raj'],
        'Age': [34, 25, 29, 41, 52, 23],
        'City': ['Bangalore', 'Chennai', 'Hyderabad', 'Hyderabad', 'Bangalore', 'Chennai'],
        'Gender': ['Male', 'Female', 'Male', 'Male', 'Female', 'Male']}

df = pd.DataFrame(data)
group_by_city = df.groupby('City')
count_series_by_group_name = group_by_city.size()
In the example above, we define a DataFrame 'df' with the columns "Name", "Age", "City" and "Gender". We group the DataFrame by the column 'City' and store the result in a DataFrameGroupBy object named 'group_by_city'.
By calling the size() method on the grouped object, we obtain a Series named count_series_by_group_name. This Series represents the number of elements in each group: its index contains the group names ('Bangalore', 'Chennai' and 'Hyderabad' in this case), and its values are the corresponding group sizes.
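If a plain DataFrame is preferred over a Series, the result of size() can be converted with reset_index(). A minimal sketch, assuming the group_by_city object defined above (the column name 'total_records' is just an illustrative choice):

# Turn the size() Series into a two-column DataFrame;
# name= sets the header of the counts column
count_df = group_by_city.size().reset_index(name='total_records')
print(count_df)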
Find the complete working application below.
count_records_in_each_group.py
import pandas as pd

# Create a sample DataFrame
data = {'Name': ['Krishna', 'Chamu', 'Joel', 'Gopi', 'Sravya', 'Raj'],
        'Age': [34, 25, 29, 41, 52, 23],
        'City': ['Bangalore', 'Chennai', 'Hyderabad', 'Hyderabad', 'Bangalore', 'Chennai'],
        'Gender': ['Male', 'Female', 'Male', 'Male', 'Female', 'Male']}
df = pd.DataFrame(data)
print(df)

# Group the rows by the 'City' column
group_by_city = df.groupby('City')

# size() returns a Series: group name -> number of rows in that group
count_series_by_group_name = group_by_city.size()

print('\nGroup Name\tTotal Records')
for index_label in count_series_by_group_name.index:
    value = count_series_by_group_name[index_label]
    print(f"{index_label} \t {value}")
Output
      Name  Age       City  Gender
0  Krishna   34  Bangalore    Male
1    Chamu   25    Chennai  Female
2     Joel   29  Hyderabad    Male
3     Gopi   41  Hyderabad    Male
4   Sravya   52  Bangalore  Female
5      Raj   23    Chennai    Male

Group Name	Total Records
Bangalore 	 2
Chennai 	 2
Hyderabad 	 2
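As a side note, when we only need row counts for a single column, Series.value_counts() gives an equivalent result. By default it sorts by count, so sort_index() is used in this sketch to match the alphabetical order produced by groupby().size(); the df is assumed to be the same one as above.

# Count occurrences of each city; sort_index() restores alphabetical order
city_counts = df['City'].value_counts().sort_index()
print(city_counts)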