In an industry in which development is so rapid, it is somewhat surprising that the technology behind monitors and televisions is over a hundred years old. Whilst confusion surrounds the precise origins of the cathode-ray tube, or CRT, it's generally agreed that German scientist Karl Ferdinand Braun developed the first controllable CRT in 1897, when he added alternating voltages to the device to enable it to send controlled streams of electrons from one end of the tube to the other. However, it wasn't until the late 1940s that CRT-based television sets reached a mass market. Although the CRTs found in modern-day monitors have undergone modifications to improve picture quality, they still follow the same basic principles.
The demise of the CRT monitor as a desktop PC peripheral had been long predicted, and not without good reason:
- they’re heavy and bulky
- they’re power-hungry – typically 150W for a 17in monitor
- their high-voltage electric field, high- and low-frequency magnetic fields and X-ray radiation have proven to be harmful to humans in the past
- they cannot be used in laptops
- the scanning technology they employ makes flicker unavoidable, causing eye strain and fatigue (the sketch after this list gives a rough feel for the scan rates involved)
- their susceptibility to electromagnetic fields makes them vulnerable in military environments
- their surface is often either spherical or cylindrical, with the result that straight lines do not appear straight at the edges.
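To put the flicker point in perspective, the following minimal Python sketch estimates the horizontal scan frequency a CRT must sustain to redraw a given number of lines at a given vertical refresh rate. The 1280x1024 resolution, the refresh rates and the 5% allowance for vertical blanking are illustrative assumptions, not figures for any particular monitor.

```python
# Minimal sketch: relate vertical refresh rate to the horizontal scan
# frequency a CRT's deflection circuitry must sustain. The 5% vertical
# blanking allowance is an illustrative assumption, not a measured figure.

def horizontal_scan_khz(visible_lines: int, refresh_hz: float,
                        blanking_overhead: float = 0.05) -> float:
    """Horizontal scan frequency in kHz: lines per frame times frames per
    second, plus a small allowance for the vertical retrace interval."""
    return visible_lines * refresh_hz * (1.0 + blanking_overhead) / 1000.0

if __name__ == "__main__":
    # Refresh rates often quoted in the flicker debate: 60 Hz is widely
    # perceived as flickering, while 85 Hz is commonly cited as comfortable.
    for refresh in (60, 75, 85):
        khz = horizontal_scan_khz(visible_lines=1024, refresh_hz=refresh)
        print(f"1280x1024 at {refresh} Hz needs roughly {khz:.1f} kHz horizontal scan")
```

The rough numbers illustrate why flicker was hard to eliminate: pushing the refresh rate high enough to be comfortable drives the horizontal scan towards 90kHz and beyond at common desktop resolutions, which demands correspondingly faster and more expensive deflection and video circuitry.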
Whilst competing technologies – such as LCDs and PDPs – had established themselves in specialist areas, there were several good reasons why the CRT was able to maintain its dominance in the PC monitor market into the new millennium:
- phosphors have been developed over a long period of time, to the point where they offer excellent color saturation at the very small particle size required by high-resolution displays
- the fact that phosphors emit light in all directions means that viewing angles of close to 180 degrees are possible
- since an electron current can be focused to a small spot, CRTs can deliver peak luminances as high as 1000 candela/m² (1000 nits)
- CRTs use a simple and mature technology and can therefore be manufactured inexpensively in many industrialized countries
- whilst the gap was narrowing all the time, they remained significantly cheaper than alternative display technologies.
However, by 2001 the writing was clearly on the wall and the CRT’s long period of dominance appeared finally to be coming to an end. In the summer of that year Philips Electronics – the world’s largest CRT manufacturer – agreed to merge its CRT business with that of rival LG Electronics, Apple began shipping all its systems with LCD monitors, and Hitachi closed its $500m-a-year CRT operation, proclaiming that “there are no prospects for growth of the monitor CRT market”. Having peaked at close to $20 billion in 1999, revenues from CRT monitor sales were forecast to plunge to about half that figure by 2007.
- The Anatomy of a CRT Monitor (and CRT TVs)
- CRT Monitor Resolution and Refresh Rates (VSF)
- Monitor Interlacing
- What is the Dot Pitch of a Computer Monitor
- Dot Trio Monitors
- Grill Aperture Monitors
- Monitor Technologies: Slotted Mask
- Enhanced Dot Pitch Monitors
- Electron Beam Monitors
- Monitor Controls
- The Different Types of CRT Monitors – From ShortNeck to FST
- What is a Digital CRT Monitor and How Does It Work
- What is LightFrame Technology?
- Safety Standards For Computer Monitors
- TCO Monitor Standards
- Monitor Ergonomics