ChumpusRex2 wrote:I think it depends on what you are trying to achieve.
Lumens are already adjusted for the retinal spectral sensitivity of photopic (normal) vision, so lumens are a good measure of luminous output under most circumstances. By extension, lumens per watt is a useful measure of efficacy.
Scotopic luminous efficacy is only relevant at very low light intensities. Typically, illumination levels of about 1 lux or less fall into scotopic (night) vision; that's roughly the outdoor light level on a half-moon night.
For such situations, e.g. low-power flashlights, the scotopic luminous efficacy is the more useful figure.
However, this level of lighting is useless for a domestic setting: at these intensities you have neither color vision nor central vision.
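The photopic/scotopic distinction above can be sketched numerically. This is a rough illustration only: the Gaussian curves below are my crude stand-ins for the real CIE luminosity functions V(λ) and V'(λ) (the 683 lm/W and ~1700 lm/W peaks at 555 nm and 507 nm are standard; the σ = 45 nm width is an assumption for the sketch):

```python
import math

def photopic_lm_per_w(nm):
    # Crude Gaussian stand-in for the CIE photopic curve V(lambda):
    # peak 683 lm/W at 555 nm; sigma = 45 nm is an assumption.
    return 683.0 * math.exp(-0.5 * ((nm - 555.0) / 45.0) ** 2)

def scotopic_lm_per_w(nm):
    # Crude Gaussian stand-in for the scotopic curve V'(lambda):
    # peak ~1700 lm/W at 507 nm; same assumed width.
    return 1700.0 * math.exp(-0.5 * ((nm - 507.0) / 45.0) ** 2)

# 1 W of red (630 nm) light looks reasonably bright in daylight vision
# but is nearly invisible to night vision (the Purkinje shift):
print(round(photopic_lm_per_w(630)), "photopic lm/W at 630 nm")
print(round(scotopic_lm_per_w(630)), "scotopic lm/W at 630 nm")
```

The point the numbers make: a source tuned for photopic lumens (green, ~555 nm) is not the same source you'd tune for a night-vision flashlight (blue-green, ~507 nm).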
True enough.
However, for some specific applications LEDs are terrific.
For example, consider supplemental lighting for a greenhouse. It's quite easy to put together an array of LEDs that emit only in the wavelengths plants actually use (PAR, roughly 400 to 700 nm, weighted toward red and blue). A tomato plant makes little use of the green and yellow light our human eyes are most sensitive to, so why spend energy producing it?
Add to that that these arrays use only a small fraction of the kWh that conventional lighting systems do, emit far less wasted (and dangerous) heat, and last dramatically longer, and... well, they've got a niche.
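Back-of-the-envelope on the kWh point. All the wattages and hours below are hypothetical round numbers I've picked for illustration, not measured figures for any particular fixture:

```python
def annual_kwh(watts, hours_per_day, days=365):
    """Energy use in kWh over a year of daily operation."""
    return watts * hours_per_day * days / 1000.0

# Assumed scenario: a 400 W conventional (e.g. HPS) grow lamp vs. a
# 150 W targeted LED array, both run 12 h/day for supplemental light.
conventional = annual_kwh(400, 12)   # 1752.0 kWh/yr
led_array    = annual_kwh(150, 12)   # 657.0 kWh/yr

print(f"saved per year: {conventional - led_array:.0f} kWh")
```

At typical residential electricity rates, savings on that order pay back a DIY array fairly quickly, which is the "can do" angle below.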
Of course, if you buy these LED arrays retail, they're "very" spendy. But for the many "can do" folks here on the PO forums, they're hard to beat.