
Help, I have a question and I need an answer

Comments

1 comment

  • David Moller

    Trying to get my color proof to more accurately simulate Pantone spots. I started going through the process of optimizing the spots via the spot color editor and noticed something strange. First, almost all Pantone spots seem to end up about 4-5 DE better after optimization than where the first iteration starts, no matter how far out of gamut the Pantone is. So if my first scan is, say, 7 DE, after 2-3 optimizations in the spot color editor I get it down to 2 DE. Or something way out there starts at DE 30 and I get it down to maybe 22. Is that normal?

    Next, I go through the motions, and my final scan says my DE is 0.5 for, say, Cool Gray 2 C. Yet when I compare the chip to the color proof, I notice it's slightly off. Granted, I'm not viewing them under perfect conditions in a light booth, but it is an area with 5000 K lighting. With the DE at 0.5, I would imagine I really shouldn't notice any variation. I also measured the actual chip, thinking maybe the target values in the system were off for some reason, but that wasn't the case: the target Lab value I measured for the Pantone was pretty much the same as what was in the system. So if anything under 3 DE is supposed to be a good match, and this is well under 3, why is there a noticeable difference between the two?
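
    For what it's worth, a single DE number hides where the difference lives. Below is a minimal sketch of the plain CIE76 calculation (straight Euclidean distance in L*a*b*), using made-up Lab values purely for illustration, not the real Cool Gray 2 C target or an actual proof reading. It shows how a reported 0.5 can be almost entirely a b* shift, which on a near-neutral light gray tends to read as a warm or cool cast even though the overall number looks tiny.

    ```python
    import math

    def delta_e76(lab1, lab2):
        """Plain CIE76 delta E: Euclidean distance in L*a*b*."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

    # Hypothetical values, only for illustration -- not the real
    # Cool Gray 2 C target or an actual proof measurement.
    target = (83.0, 0.5, 1.50)   # assumed Pantone target L*, a*, b*
    proof  = (83.1, 0.6, 1.98)   # assumed proof measurement

    de = delta_e76(target, proof)
    print(f"dE76 = {de:.2f}")    # ~0.50
    print(f"dL* = {proof[0] - target[0]:+.2f}, "
          f"da* = {proof[1] - target[1]:+.2f}, "
          f"db* = {proof[2] - target[2]:+.2f}")
    ```

    In this hypothetical case nearly all of the 0.5 sits in db*, i.e. a slight yellow/blue difference, which is exactly the kind of shift that is easy to spot side by side on a neutral gray. It may also be worth checking which formula the software reports (CIE76, CIE94, or CIEDE2000), since the same pair of measurements can give noticeably different numbers under each.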

