We take them for granted now, but think back to the time before we had computers. Hell, just think back a decade to when you had to sit there for five minutes listening to various bleeps and bloops while your modem tried desperately to connect you to AOL. You were never quite sure when the clock started ticking down on your 100 hours of Internet for the month. Things have definitely changed. For the better, I'd argue.
Moore's law, rooted in Gordon Moore's 1965 observation that the number of transistors on a chip doubles roughly every two years, has held remarkably accurate for more than forty years now. But if we keep moving at the same speed and with many of the same components, we're going to max out in only another twelve years or so. That's when experts predict transistors will hit their physical size limit: at near-atomic scales, they believe, the electrons that represent data can no longer be reliably confined to our circuits.
With that in mind, OnlineComputerScienceDegree.com has created an infographic that serves as a new timeline of the computer's history. Bonus: Feel free to sing the large words that run the length of the image out loud to the tune of R.E.M.'s "It's the End of the World as We Know It (And I Feel Fine)."