Is Technology Moving Too Fast? The Evolution of Technology and Inventions

Hey guys! Welcome to another awesome article, where we take interesting ideas related to data, technology, business, and careers and turn them into unique and intriguing content for your enjoyment. So, without delay, let's get started. Nowadays, it feels like every week another ground-breaking invention or idea is revealed. But how did we get here?

In this article, we're going to look at the evolution of technology from the very beginning. We'll go through some of the most influential devices and concepts that have led us to the techno-fuelled world we live in today. Our time-travel journey starts pretty much when we did, before selfies, trolling, open-world gaming, and cheesy infographic videos, way back in 35,000 BC with the first recorded example of counting. It all began with a simple tally on a bone, the Lebombo bone. Talk about upcycling! More conveniently, papyrus paper was invented around 3,000 BC for recording, and the abacus around 2,300 BC for sums.

These had evolutions of their own over the next couple of millennia, up to around 100 CE, when they reached the state we know them in today. This was also the time Hindu-Arabic numerals were developed: not the first written numbers, but the first that allowed complex mathematics. Jumping to the second millennium, we have the development of the torquetum, a complex analogue computer used to measure astronomical coordinates and the basis for modern astronomical instruments. The torquetum was one of the first devices used for the observation of Halley's Comet. Equally terrifying and amazing, in 1206 came one of the first examples of the concept of automation: Al-Jazari's programmable automata, which helped to fuel the ideas of mechanical humans and artificial intelligence.

Recently, evidence was discovered showing that the first mechanical calculator was conceived in 1502 by none other than Leonardo da Vinci, who apparently invented everything. However, nothing came of it, and after a few manual calculation systems, the first mechanical calculator was realised as Pascal's calculator in 1642. Shortly after, we got the binary system, but it wasn't until the 1800s that technology was advanced enough to take advantage of it. It started with the Jacquard loom, which used punch cards to weave designs automatically, then led to Babbage's concept of the Difference Engine and, later, the Analytical Engine. Unfortunately, these last two projects weren't actualised due to funding, but that didn't stop the world's first computer programmer, Ada Lovelace, from seeing how they could work and writing the first ever algorithm.

Then we literally captured lightning in a bottle and harnessed electricity; we performed the census with a tabulating machine for the first time; the triode was invented, essential for the development of television and radio; we came up with the idea of a thinking machine, the first ever robot, in the sci-fi film "Metropolis"; and the Turing machine was conceptualised. All this played a huge part in the creation of the first electronic computer in 1939: the Atanasoff-Berry Computer. Work on the Atanasoff-Berry was discontinued due to World War II, but other computers built for the war effort, like the Colossus and the Zuse Z3, were developed.
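Since the binary system powers pretty much everything that follows in this story, here's a quick toy sketch (in Python, purely for illustration) of how a familiar decimal number becomes the ones and zeros a machine actually works with:

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to its base-2 digit string."""
    if n == 0:
        return "0"
    bits = []
    while n:
        bits.append(str(n % 2))  # remainder by 2 gives the lowest bit
        n //= 2                  # drop that bit and move to the next place
    return "".join(reversed(bits))

print(to_binary(42))  # → "101010"
```

Every value a computer stores, from a census record to a cat video, ultimately bottoms out in digit strings like this one.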

During the war, a model for computational neural networks was created. After the war had ended, we got the ENIAC, the grandfather of digital computing; the UNIVAC for business and government; the transistor; and the first thinking machines in the SNARC and the IBM 701. In fact, the IBM 701 was the first computer to display AI capabilities, as it learned to play checkers. The 50s saw another step forward in machine learning with the Logic Theorist, a program designed to mimic the problem-solving skills of a human and dubbed the first AI program. This decade also saw the integrated circuit, which by the 60s was starting to be used in digital computers.

This gives us the first ever minicomputer, the PDP-1, which, in turn, creates the need for the mouse and the graphical user interface. We also got ELIZA, the first ever chatbot. The end of the 60s saw programmable calculators, supercomputers, operating systems, and the introduction of ARPANET, the internet's humbler and less complex older sibling, and of course the Apollo Guidance Computer. The 1970s saw these technologies become commercial, and with that came a flood of new developments: the Honeywell 316, the Canon Pocketronic, the first DRAM and microprocessors, the iconic floppy disk, the first humanoid robot, and the first commercially available microcomputer.
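To give a flavour of how ELIZA pulled off its conversational trick, here's a minimal sketch of the same idea: match the user's words against patterns and fill canned response templates. The rules below are my own toy examples, not ELIZA's actual script:

```python
import re

# Toy rules in the spirit of ELIZA: (pattern, response template)
RULES = [
    (re.compile(r"i am (.*)", re.I), "Why do you say you are {0}?"),
    (re.compile(r"i feel (.*)", re.I), "What makes you feel {0}?"),
    (re.compile(r".*\bmother\b.*", re.I), "Tell me more about your family."),
]

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = pattern.match(text.strip())
        if match:
            # Echo the captured phrase back inside the template
            return template.format(*match.groups())
    return "Please, go on."  # stock reply when nothing matches

print(respond("I am tired"))  # → "Why do you say you are tired?"
```

There's no understanding anywhere in this loop, which is exactly why ELIZA's apparent intelligence surprised so many people at the time.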

The decade saw plenty of improvements to the technology, including Microsoft's first programming language and IBM's and Apple's original entries into desktop computing. The end of the 70s saw the first automated vehicle, the Stanford Cart, and led us into the 80s, where technology was quickly getting smaller, cheaper, and more powerful. We also find the development of backpropagation and the CD-ROM. During this time, tech is becoming so compact that the Gavilan SC is marketed as a 'laptop' computer. AI is also making leaps and bounds with Cyc, a project aiming to build a knowledge base of basic concepts, 'rules of thumb', and common sense to help AI applications perform human-like thinking. In the following five years we get handheld computers, AI that teaches itself to speak, the first versions of Windows and Excel, plenty of supercomputers, laptops, desktops, and processors, and in 1989 we get the World Wide Web.

The web and some big leaps in processing power in the early 90s brought PCs into the realm of entertainment devices. This decade introduced us to PC gaming, DVDs, and internet browsers. We got some of the first mobile phones from Nokia to fit in our hands, and supercomputers that could now beat human chess grandmasters, with Deep Blue. And let's not forget Skynet, I mean Google.

Over the next decade we see the development of various computers, hard drives, robots, processors, and the still-in-use USB. In the mid-2000s we get our first 64-bit processor and dual-core CPU, a huge jump in the speed and power of computing. Speaking of speed, in 2002 we got the first commercial maglev (short for magnetic levitation) train. The world is also introduced to YouTube, and thus the innate human need for on-demand cat videos is satisfied. With the huge growth and commercialisation of technology in the last decade or two, data is everywhere; that's why, in 2005, the term 'big data' was coined.
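Backpropagation, mentioned above among the 80s breakthroughs, is at heart just "nudge each weight against its error gradient". Here's a toy one-neuron version of that idea (the numbers and learning rate are my own illustrative choices; real networks chain this rule through many layers):

```python
# Train one neuron (weight w, bias b) to learn y = 2x by gradient descent.
w, b = 0.0, 0.0
lr = 0.1  # learning rate
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

for _ in range(200):  # repeat over the data many times
    for x, y in data:
        err = (w * x + b) - y  # how far off the prediction is
        # Gradients of the squared error 0.5*err**2 w.r.t. w and b
        w -= lr * err * x
        b -= lr * err

print(round(w, 2), round(b, 2))  # converges near w = 2, b = 0
```

The weight ends up near 2 because every update pushes the prediction a little closer to the data; backpropagation generalises this exact step to millions of weights.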
Apple shows its power in the tech field with the release of the MacBook Pro, while the iPhone brings smartphones to the mainstream. In the late 2000s, IBM shows off its dominance in the realm of the supercomputer by having the fastest in the world with Blue Gene, then bettering it themselves with Roadrunner. Blockchain arrives in force in 2008 to shake up the financial sector, among others. China and Japan start to push back against IBM in a battle for the most powerful supercomputer, with the Tianhe and the K computer, respectively.

This leads to some of the biggest jumps in computing power. We also see another show of AI's growth with Siri, and with Watson, who beats human opponents at Jeopardy!. And it's not just on Earth where things are getting more advanced, with the Mars rover showing the rest of the universe what we're capable of.

To exemplify how far we've come by this point: in 2012 and 2013 we have supercomputers reaching tens of petaflops, and on our wrists we have computational devices millions of times faster than the computers that sent a rocket to the moon. And did I mention that our rockets can now land themselves, with a lot more computing power than a watch, mind you?

By 2016 we've officially reached sci-fi territory with the eerily lifelike Sophia, 3-D printed prosthetic limbs, and realistic virtual reality. At this point we're looking at molecular computing, because humanity loves making small tech, especially things like almost paper-thin laptops with incredibly fast processors and cancer-fighting nanobots. And because we also like to show off, we have the fastest supercomputer at 200 petaflops with Summit. New ground is broken where travel and AI combine, with Tesla's Model S, and with Atlas, who puts many of us to shame with his precision parkour.

We really are living in amazing times. But where do you think we're heading next? What will be the next big thing? Are we maybe going too fast? Our ancestors had much more time to process and learn from new developments. We'd like to open up a discussion in the comments and get your thoughts on the future of technology.

Thanks for reading.


If you have any doubts, let me know.
