Chances are, you are using a computer right now to read these words. And even if you’re reading them in print, a computer was needed to build this HTML, send it from the server, and parse it on the client’s side. Computers are everywhere. Everything from a giant factory to a little cell phone is run by computers. We use them every day: to calculate, to communicate, to control, and to entertain. I don’t really have to tell you this; everyone knows what a computer is. But how many people ever stop to think just how much science has gone into computers?
The most crucial component of a computer is, of course, the central processing unit, the CPU, which is an integrated circuit printed on a thin wafer of silicon. Inorganic chemistry tells us how to obtain, purify, and modify the necessary semiconductors, but physics, a whole new branch of it developed over the last fifty years, tells us what to make, and how those microscopic deposits of metals, oxides, and semiconductors become diodes, transistors, resistors, and capacitors, which in turn make up the logic gates, flip-flops, and other higher-order elements of an integrated circuit.
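The leap from transistors to logic is easier to appreciate with a toy model. Here is a minimal sketch in Python (the function names are my own, purely illustrative): a NAND gate, which in CMOS takes just four transistors, is enough on its own to build every other gate, and from those gates, the beginnings of arithmetic.

```python
def nand(a: int, b: int) -> int:
    """A NAND gate: 0 only when both inputs are 1.
    In CMOS, this is built from just four transistors."""
    return 0 if (a and b) else 1

# Every other gate can be composed from NAND alone:
def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor_(a: int, b: int) -> int:
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# And from gates, a half adder -- the seed of the CPU's arithmetic unit:
def half_adder(a: int, b: int) -> tuple:
    """Adds two one-bit numbers, returning (sum, carry)."""
    return xor_(a, b), and_(a, b)
```

Chain enough half and full adders together and you can add arbitrarily wide numbers, which is exactly what the arithmetic unit of a real CPU does, only in silicon rather than in software.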
Another major component of every modern computer is the hard drive, a fast permanent storage device of tremendous capacity, often terabytes of data in a single drive. It is a distant descendant of an old physical phenomenon: iron filings aligning along magnetic field lines. A modern hard drive contains rapidly spinning platters covered in thin magnetic films, the product of decades of research by inorganic chemists. The read heads that hover above the platters at microscopic distances carry complex multilayer film structures that exploit a quantum mechanical effect known as giant magnetoresistance (discovered in 1988, awarded the Nobel Prize in 2007): essentially, a dramatic drop in electrical resistance in the presence of a magnetic field, which permits high-density data recording and playback.
Now don’t forget the monitor, the part you’re looking at right now! Until recently, they were all cathode ray tubes, which were essentially linear particle accelerators: high voltage generated an electron beam (physics!), which flew through a vacuum, focused by computer-controlled magnetic fields, and struck chromophoric chemicals (chemistry!) on the inside of a glass pane. Display technology has since progressed and diversified, covering everything from projectors to huge plasma screens (mass-produced since the mid-1990s), thousands of tiny glass cells filled with ionized noble gases and the same chromophoric coating as the CRTs, to the now-ubiquitous liquid crystal displays (first produced in 1972), where molecules of a nematic liquid crystal are arranged in a 90-degree twist that rotates the polarization of passing light; an applied electric field untwists them. Placed between two polarizing filters oriented 90 degrees with respect to each other, such a cell is transparent in the absence of an electric field and turns opaque when a voltage is applied. Each pixel is controlled by its own transistor, created right on the glass surface using thin-film (TFT) technology, with transparent indium tin oxide films serving as the conductive coating. The liquid crystals themselves are complex molecules belonging to the domain of organic chemistry. The films and the coatings come from inorganic chemistry, and, again, physics binds everything together into a marvel of engineering found on everybody’s desk.
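The optics of that pixel come down to one formula, Malus’s law: polarized light passing through an analyzer at a relative angle θ keeps a fraction cos²θ of its intensity. A small sketch (the function name is my own, for illustration) shows why the twist matters:

```python
import math

def transmitted_fraction(light_angle_deg: float, analyzer_angle_deg: float) -> float:
    """Malus's law: the fraction of polarized light that passes an
    analyzer, given the angle between the light's polarization and
    the analyzer's axis: I/I0 = cos^2(theta)."""
    theta = math.radians(analyzer_angle_deg - light_angle_deg)
    return math.cos(theta) ** 2

# Crossed polarizers alone: light polarized at 0 degrees meets an
# analyzer at 90 degrees, and essentially nothing gets through.
blocked = transmitted_fraction(0, 90)

# With the liquid crystal's 90-degree twist in between, the light
# arrives already rotated to 90 degrees, parallel to the analyzer,
# so the pixel is bright.
bright = transmitted_fraction(90, 90)
```

Applying a voltage untwists the crystal, the light reaches the analyzer still at 0 degrees, and the pixel goes dark; intermediate voltages give the grey levels in between.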
One more science that sometimes gets unfairly omitted when discussing the complexity of computers is, ironically, computer science itself. Without software, a computer is just a dead assembly of wires and semiconductors. Software is the product of software engineering, which applies the concepts and theorems of computer science to the real world, and does so with a great degree of success. Even accounting for the endless bugs and breakdowns, computers, in general, do what people tell them to do, and people often bet their lives on that.
There’s a great deal more to be said about computers and how all of humankind has changed because of them. They’ve become as ubiquitous as electricity, which likewise changed the life of every single person at the end of the 19th century. More and more people take them for granted and forget the century of scientific knowledge distilled and concentrated at their fingertips.