Survey: Employers of outdoor workers should provide sunscreen
Dermatologists strongly recommend using sunscreen to reduce exposure to the ultraviolet (UV) radiation that can cause skin cancer, the most common form of cancer in the U.S.
When that exposure occurs on the job, 74 percent of Americans believe businesses with outdoor workers should provide sunscreen for their employees to use at work, according to a study commissioned by Deb Group, a company that offers a professional range of UV protection creams. Yet 71 percent of outdoor workers say their employers do not provide sunscreen for use at work.[1]