Hiya. Can I filter which scattergeo text labels show based on a criterion, e.g. the ‘count’ column being greater than or equal to a value such as 34?
Alternatively, can I pass a specific list of ‘state_code’ abbreviations to label?
I want only the states with the highest counts to be labeled with their counts; in other words, hide the count labels on all states except California, Texas, and New York.
import pandas as pd
import plotly.express as px

# make a minimal dataframe
data = [['CA', 42], ['NY', 41], ['TX', 34], ['FL', 31], ['PA', 26]]
dfdemo_map = pd.DataFrame(data, columns=['State_code', 'count'])

# plot a choropleth with color range by count per state
fig = px.choropleth(dfdemo_map,
                    locations='State_code',
                    locationmode='USA-states',
                    color='count',
                    scope='usa')

# center the title
fig.update_layout(title_text='How Many Survey Respondents from each State', title_x=0.5)

# label states with count
fig.add_scattergeo(locations=dfdemo_map['State_code'],
                   locationmode='USA-states',
                   text=dfdemo_map['count'],
                   mode='text')
My map currently looks like this.
I don’t see a way to limit which labels the scattergeo shows, so you can update the trace data directly after creating the figure.
new_texts = [state + ':' + str(cnt) for state, cnt in zip(dfdemo_map['State_code'][:3], dfdemo_map['count'][:3])]
# fig.data is a tuple of traces, so index the trace before assigning
fig.data[0]['text'] = new_texts
fig.data[0]['locations'] = ['CA', 'NY', 'TX']
@r-beginners I still want to show the choropleth colors on all the states, however. I only want the specific numerical labels to show on the states with the largest counts. This looks like it hides the colors for everything else, but maybe I can overlay it on top to double up those few states?
One way to do it might be by adding a new column.
In this column, the values stay as they are if they are greater than or equal to 34; anything below becomes np.nan, which produces no text label.

import numpy as np

dfdemo_map['count_cond'] = [x if x >= 34 else np.nan for x in dfdemo_map['count']]

and then pass that column as the text argument:

text = dfdemo_map['count_cond'], # <---------
Just tried it with your code, seems to work:
Hope this is what you were looking for.
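For the other question in the thread, labeling from an explicit list of state abbreviations, the same trick works with `isin` (a sketch; `states_to_label` is a name I made up):

```python
import numpy as np
import pandas as pd

data = [['CA', 42], ['NY', 41], ['TX', 34], ['FL', 31], ['PA', 26]]
dfdemo_map = pd.DataFrame(data, columns=['State_code', 'count'])

# hypothetical list of the states that should keep their labels
states_to_label = ['CA', 'NY', 'TX']

# keep the count where the state is in the list, NaN (no label) elsewhere
mask = dfdemo_map['State_code'].isin(states_to_label)
dfdemo_map['count_cond'] = dfdemo_map['count'].where(mask, np.nan)
```

The resulting `count_cond` column can then be passed as the `text` argument exactly as above.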
The image is limited to those states because my snippet replaces the trace data you provided. If you want to keep all the states colored and only show the labels for the largest counts, that would be the method @eliasdabbas is suggesting.
@eliasdabbas great idea, thank you! I sometimes forget to try a solution by changing the data I’m plotting.
Thank you again, this is definitely a good alternative I’ll keep in mind!
@kmhurchla Glad it worked