Yahoo Groups archive

QTR-Quadtone RIP

Index last updated: 2026-04-28 23:12 UTC

Message

Re: linearization questions

2005-05-04 by ccolbertbw

Hi Roy, Tyler, Daniel,

Just some thoughts about linearization, etc.

I have generally had very good luck with both quad inksets and, more
recently, UC with Paul Roark's PKN and LKN. While maintaining the
ability to print color, I can get a neutral B&W with minimal toning
from the LC and LM.

It has always amazed me just how well the single linearization does
given that there are many nonlinearities that the process would seem to
ignore.  In general, it is just great.

For some reason the PKN on Luster paper builds up density in a very
nonlinear manner: the density curve is very steep, then flattens
out. But you don't want to ignore that top part of the curve by setting
an early ink limit, because it means the difference between a dmax of
2.0 and 2.3. The LKN has a much more gradual slope throughout. As
such, when linearization scales down the output values,
which it must do strongly for the PKN, the effect on the density
contributed by the different inks is not proportional. Thus there is
some error in the final density. For this situation it would be nice
to linearize the single ink before any partitioning is done. (This is
my understanding, consistent with experience--please correct me if I
am wrong.)
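A toy numerical sketch of that non-proportional effect (the two curve shapes below are entirely hypothetical, not measured data): scaling the output value of a steep-then-flat ink barely changes its density contribution, while the same scaling cuts a gradual ink's density almost proportionally.

```python
# Hypothetical density curves: PKN-like (steep, then flat) vs LKN-like
# (nearly linear). Both take an output value v in [0, 1].

def pkn_density(v):
    # Steep rise that flattens out near the top (made-up shape).
    return 2.3 * (1 - (1 - v) ** 4)

def lkn_density(v):
    # Nearly linear ramp to a lower dmax (made-up shape).
    return 1.4 * v

scale = 0.8  # one global linearization scaling factor applied to both inks
for name, curve in [("PKN", pkn_density), ("LKN", lkn_density)]:
    retained = curve(scale) / curve(1.0)
    print(f"{name}: fraction of density retained after scaling = {retained:.2f}")
```

With these shapes, scaling the output to 80% removes roughly 20% of the LKN's density but well under 1% of the PKN's, so a single scaling factor cannot serve both inks at once.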

Perhaps more of a challenge is that density gets really wacky at the
dark end for many inks, papers, and printers. I have had a lot of trouble
getting good, uniform separation above 95%. Linearization as it
stands is implemented by fitting a curve to the whole density function,
then scaling back the output values to fit an "ideal" (perceptually
uniform) curve. Unfortunately, the heavy ink loads refuse to behave
nicely, making it very hard to get a good curve fit. Wiggles in the
curves result. For this reason a 21-step linearization often produces
a smoother curve than the 51-step. A second linearization at the end
would help a bit here, but would not take out isolated bumps in density.
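The fit-then-rescale step described above can be sketched like this (a minimal illustration, not QTR's actual code; the step counts and densities are made up): fit a curve through the measured step-wedge densities, then invert it to find the output values whose printed density lands on a straight-line target.

```python
# Sketch of linearization as curve fit + rescale (hypothetical data).

def interp(x, xs, ys):
    """Piecewise-linear interpolation of y at x over sorted sample points."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return ys[i] + t * (ys[i + 1] - ys[i])
    return ys[-1]

# Measured densities at 6 input steps (made up: steep, then flat).
inputs    = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
densities = [0.0, 0.9, 1.5, 1.9, 2.1, 2.2]

def linearize(step):
    """Output value whose measured density hits the ideal (linear) target."""
    target = step * densities[-1]             # ideal: density linear in input
    return interp(target, densities, inputs)  # invert the measured curve

corrected = [round(linearize(s), 3) for s in inputs]
print(corrected)
```

Note how much the midtone outputs get pulled down for a steep curve, while the top step stays at full value to preserve dmax; a single wiggle in the measured densities would propagate straight into `corrected`, which is the smoothness problem described above.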

I have been thinking that one approach would be to make and measure a
test chart with many dense values, from say 95-100%, made up of
different output values of the darkest inks. A program could then build
up the curve that best produces the desired densities, rather than trying
to deduce the closest single scaling factor. I'll do this if I can find
the time.
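The chart-driven idea above might look something like this (a sketch only; the patch data is invented, and a real chart would vary several dark inks at once): for each desired shadow density, pick the measured patch that comes closest, instead of trusting one global scaling factor.

```python
# Hypothetical (output_value, measured_density) pairs from a 95-100% chart.
patches = [
    (0.95, 2.02), (0.96, 2.08), (0.97, 2.06),  # note the non-monotone bump
    (0.98, 2.15), (0.99, 2.19), (1.00, 2.22),
]

def best_output(target_density):
    """Output value whose measured density is nearest the target."""
    return min(patches, key=lambda p: abs(p[1] - target_density))[0]

# Desired densities for the 95-100% steps of the final curve.
targets = [2.00, 2.05, 2.10, 2.15, 2.20]
curve_tail = [best_output(d) for d in targets]
print(curve_tail)
```

Because the lookup works from measured densities rather than a fitted scaling, an isolated bump in the patch data (like the one above) gets routed around instead of bending the whole curve.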
 
Short of this, experimenting with the shadow and highlight values helps a
huge amount. But the trial and error of printing many curves with
different values just to get a less bumpy starting point takes more time
and effort than an (optionally) more involved linearization process would.

just my two cents.

Costa Colbert

--- In QuadtoneRIP@yahoogroups.com, "Roy Harrington" <roy@h...> wrote:
> 
> Hi Tyler,
> 
> As things stand Linearization is a one-shot operation, i.e. you always
> apply it to the same "raw" output. However, the gamma and
> (highlight/shadow) values are applied first, so you can get closer to
> linear before getting to the final linearization. These help a whole
> lot because the 21 steps are more evenly spaced out.
> 
> I've thought about allowing iterative linearizations, or possibly
> linearizations at other levels, i.e. gray or toner linearizations. How
> valuable do you think this would be versus how much more complicated it
> would be?
> 
> Roy
>
