Plotly plot of very small numerical data looks like a step function

I am trying to plot with px.line, or with plotly.graph_objects.Scatter and mode='lines', using data whose x values range from 10E-15 to 10E+7 and y values from 10E-2 to 10E+6.

In both cases, the plot shows something like a step function in the lower range, especially below 10E-9 on the x-axis, as shown in the figures, regardless of log or linear axis scaling. When I plot the same data with Gnuplot or Excel, the line looks very smooth.

I tried the line_shape, line_smoothing, and line_simplify options, but the plot never changed in any case. Does anyone have an idea how to make this plot completely smooth?

Hi @fisbar ,

Have you tried px.scatter instead of a line plot?

fig = px.scatter(data_frame=df, x="Xnumbers", y="Ynumbers", log_x=True, log_y=True)
fig.data[0].update(mode="lines")

Seems to work on my end! Here is a log-log plot with markers and lines for illustration.
[Image: log-log scatter with markers and lines]

Here is a plot spanning more decades. I cannot reproduce your issue in a scatter plot (module versions: plotly==4.0.0).

Hi NickL, thank you very much for your idea. I tried px.scatter with update(mode='lines'), but it also doesn't work for my data below x=E-9. The point is that there is no problem above x=E-9.

Your idea made me realize that Plotly plots the data below x=10E-9 in a very weird way. It doesn't plot the data at the exact [X, Y] position; instead, it snaps the data to fixed steps, i.e., 1.0E-11, 2.0E-11, 3.0E-11, … 9.0E-11!

I wonder if there are any options for reading such small numbers. I really appreciate any help.

The following is the table I use for the plot.
1.00000E-11 1.75738E+06
1.03223E-11 1.72972E+06
1.06549E-11 1.70251E+06
1.09984E-11 1.67572E+06
1.13527E-11 1.64936E+06
1.17186E-11 1.62340E+06
1.20962E-11 1.59787E+06
1.24861E-11 1.57272E+06
1.28884E-11 1.54798E+06
1.33038E-11 1.52362E+06
1.37325E-11 1.49965E+06
1.41751E-11 1.47605E+06
1.46318E-11 1.45283E+06
1.51034E-11 1.42997E+06
1.55901E-11 1.40748E+06
1.60926E-11 1.38533E+06
1.66111E-11 1.36353E+06
1.71465E-11 1.34208E+06
1.76990E-11 1.32096E+06
1.82694E-11 1.30018E+06
1.88581E-11 1.27972E+06
1.94659E-11 1.25958E+06
2.00931E-11 1.23977E+06
2.07407E-11 1.22026E+06
2.14090E-11 1.20106E+06
2.20991E-11 1.18216E+06
2.28111E-11 1.16357E+06
2.35464E-11 1.14526E+06
2.43051E-11 1.12724E+06
2.50884E-11 1.10950E+06
2.58968E-11 1.09205E+06
2.67315E-11 1.07486E+06
2.75928E-11 1.05795E+06
2.84822E-11 1.04130E+06
2.93999E-11 1.02492E+06
3.03475E-11 1.00879E+06
3.13253E-11 9.92924E+05
3.23350E-11 9.77300E+05
3.33769E-11 9.61925E+05
3.44526E-11 9.46788E+05
3.55627E-11 9.31893E+05
3.67089E-11 9.17229E+05
3.78918E-11 9.02799E+05
3.91130E-11 8.88593E+05
4.03734E-11 8.74613E+05
4.16746E-11 8.60851E+05
4.30174E-11 8.47307E+05
4.44039E-11 8.33974E+05
4.58347E-11 8.20854E+05
4.73119E-11 8.07937E+05
4.88364E-11 7.95227E+05
5.04104E-11 7.82713E+05
5.20348E-11 7.70399E+05
5.37119E-11 7.58276E+05
5.54426E-11 7.46347E+05
5.72295E-11 7.34603E+05
5.90736E-11 7.23046E+05
6.09775E-11 7.11668E+05
6.29424E-11 7.00472E+05
6.49710E-11 6.89449E+05
6.70645E-11 6.78602E+05
6.92260E-11 6.67924E+05
7.14566E-11 6.57416E+05
7.37596E-11 6.47071E+05
7.61364E-11 6.36891E+05
7.85902E-11 6.26869E+05
8.11227E-11 6.17007E+05
8.37372E-11 6.07298E+05
8.64355E-11 5.97743E+05
8.92212E-11 5.88337E+05
9.20962E-11 5.79081E+05
9.50644E-11 5.69969E+05
9.81277E-11 5.61002E+05
1.01291E-10 5.52173E+05
1.04554E-10 5.43487E+05
1.07924E-10 5.34933E+05
1.11402E-10 5.26518E+05
1.14992E-10 5.18232E+05
1.18697E-10 5.10080E+05
1.22523E-10 5.02053E+05
1.26471E-10 4.94155E+05
1.30547E-10 4.86378E+05
1.34754E-10 4.78727E+05
1.39097E-10 4.71193E+05
1.43579E-10 4.63780E+05
1.48207E-10 4.56481E+05
1.52982E-10 4.49300E+05
1.57913E-10 4.42230E+05
1.63001E-10 4.35272E+05
1.68254E-10 4.28422E+05
1.73676E-10 4.21682E+05
1.79274E-10 4.15046E+05
1.85050E-10 4.08517E+05
1.91014E-10 4.02088E+05
1.97169E-10 3.95762E+05
2.03524E-10 3.89534E+05
2.10082E-10 3.83406E+05
2.16853E-10 3.77372E+05
2.23840E-10 3.71435E+05
2.31055E-10 3.65590E+05
2.38500E-10 3.59838E+05
2.46187E-10 3.54175E+05
2.54120E-10 3.48603E+05
2.62310E-10 3.43117E+05
2.70762E-10 3.37719E+05
2.79489E-10 3.32404E+05
2.88494E-10 3.27175E+05
2.97793E-10 3.22026E+05
3.07388E-10 3.16959E+05
3.17295E-10 3.11971E+05
3.27519E-10 3.07063E+05
3.38075E-10 3.02231E+05
3.48969E-10 2.97475E+05
3.60216E-10 2.92794E+05
3.71823E-10 2.88187E+05
3.83807E-10 2.83652E+05
3.96174E-10 2.79189E+05
4.08943E-10 2.74795E+05
4.22120E-10 2.70472E+05
4.35725E-10 2.66215E+05
4.49765E-10 2.62027E+05
4.64261E-10 2.57903E+05
4.79221E-10 2.53845E+05
4.94666E-10 2.49850E+05
5.10605E-10 2.45919E+05
5.27062E-10 2.42049E+05
5.44045E-10 2.38240E+05
5.61580E-10 2.34491E+05
5.79675E-10 2.30801E+05
5.98358E-10 2.27169E+05
6.17639E-10 2.23594E+05

I pasted your data into a CSV and loaded it up like this:

import pandas as pd
import plotly.express as px

df = pd.read_csv("~/Downloads/xy.csv")
px.scatter(df, x="X", y="Y", log_x=True, log_y=True)

and got the following output, as expected:

… it seems like somehow the data you’ve read in is being clipped at some fixed precision or something?

Hi nicolaskruchten! Thanks for your check and comment. OK, I confirmed that too. Actually, the data contains more than 60,000 rows, from x = 10E+2 down to x = 10E-11. I cannot share the data, but maybe that would be the reason…? I will try skipping some data points.

Yes, thanks for your hint, nicolaskruchten!

I figured out the reason. I convert the DataFrame to JSON with df.to_json() and reconstruct it with pd.read_json(), because I need to share the DataFrame between several Dash callbacks. In doing so, the X values were altered.

Before the JSON round trip:

0 1.000000e-11 1.757380e+06
1 1.032230e-11 1.729720e+06
2 1.065490e-11 1.702510e+06
3 1.099840e-11 1.675720e+06
4 1.135270e-11 1.649360e+06

After the round trip, the X column is all zeros:

0 0.0 1.757380e+06
1 0.0 1.729720e+06
2 0.0 1.702510e+06
3 0.0 1.675720e+06
4 0.0 1.649360e+06

So my problem was definitely NOT a Plotly problem. I will store and reconstruct the DataFrame with explicitly specified precision.

Thanks for all your help!!!

In df.to_json(), double_precision defaults to 10, so I could fix this problem using df.to_json(double_precision=12).
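A minimal sketch of the round-trip problem and the fix (the column names and values are illustrative; io.StringIO is used because newer pandas versions deprecate passing a raw JSON string to read_json):

```python
import io
import pandas as pd

df = pd.DataFrame({"X": [1.03223e-11, 1.06549e-11],
                   "Y": [1.72972e+06, 1.70251e+06]})

# Default double_precision=10: X values around 1e-11 round to 0.0 in the JSON
lossy = pd.read_json(io.StringIO(df.to_json()))

# Raising double_precision keeps the small X values nonzero after the round trip
fixed = pd.read_json(io.StringIO(df.to_json(double_precision=15)))
```

Here the maximum of 15 is used for the sketch; the thread's double_precision=12 works the same way, trading a few mantissa digits for a shorter JSON payload.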