Reputation: 163
So if I have a set of data, say "GnuTest.dat", and I fit f(x) = a*x**2 to it and then plot them both, is there a way to have the color of f(x) vary with the value of f(x)?
This is what I have at the moment:
set autoscale
set term aqua size 700,500
set palette positive nops_allcF maxcolors 1 gamma 1.5 color model HSV
set palette defined ( 0 0 1 1 , 1 1 1 1 )
unset log
set termoption enhanced
set xtics 1
set mxtics 4
set ytics 1
set mytics 4
set title "Gnuplot Script Template"
set xlabel "x"
set ylabel "y"
set size ratio -1
a = 1
b=0.5
f(x) = a*x**2 +b
stats "GnuTest.rtf" u 1:2
print STATS_max_y;
fit f(x) "GnuTest.rtf" using 1:2 via a,b
plot "GnuTest.rtf" using 1:2 title "Gnu Test" w points pt 7, f(x) t "ax^2 + b fit" w lines lc variable
So in theory, I would like to vary the color along the hue scale between the minimum and maximum f(x) values. I've tried appending :(f(x)/STATS_max_y) or something similar after using 1:2, but I always get errors like "x not defined" or "; expected". Am I missing something obvious?
Edit: This is how my plot looks following @Christoph's advice. I have set termoption solid and used lt -1.
Upvotes: 2
Views: 1168
Reputation: 48390
In order to use a variable linecolor with a function, you must use the '+' special filename. That allows you to use the using option, which is required for a variable linecolor. Note that this is not related to fitting at all.
Here is a minimal, runnable script:
set palette defined ( 0 0 1 1 , 1 1 1 1 ) color model HSV gamma 1.5
set xrange[0:10]
f(x) = x**2
plot '+' using 1:(f($1)):(f($1)) with lines linecolor palette lw 3 notitle
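For completeness, here is a sketch of how this could be combined with the data points and the fit from the question. It assumes the same "GnuTest.rtf" file and the fitted f(x) = a*x**2 + b; the x range of 0 to 10 is an assumption and should match your data:

```gnuplot
# Hue palette, as above
set palette defined ( 0 0 1 1 , 1 1 1 1 ) color model HSV gamma 1.5

a = 1
b = 0.5
f(x) = a*x**2 + b
fit f(x) "GnuTest.rtf" using 1:2 via a,b

# '+' samples over the current xrange, so set it explicitly
set xrange [0:10]

plot "GnuTest.rtf" using 1:2 with points pt 7 title "Gnu Test", \
     '+' using 1:(f($1)):(f($1)) with lines linecolor palette lw 3 title "ax^2 + b fit"
```

The data points keep their fixed style, while the fitted curve is drawn through the '+' pseudo-file so that its third using column can drive the palette color.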
The result with gnuplot 4.6.3 is:
Note that you cannot use maxcolors 1, as this gives (at least for me) only black.
Upvotes: 3