Friday, October 9, 2015

The Future of Computing


The first iPhone, already obsolete by today's standards.

           When Apple released the first iPhone, it was hailed as a technological marvel: for the first time, a computer, camera, music player, email, and the Internet could fit in the palm of your hand, all controlled through a seamless touch-screen interface. This advancement was made possible by the products that came before it. Sure, there were earlier MP3 players, point-and-shoot cameras, and other cell phones, but they were all separate devices; at best, a cell phone integrated only one or two of those features. The ability to pack more features into a smaller package was fueled in part by Moore’s Law, as individual components kept getting smaller, faster, and more energy efficient (Wöhrl). The computing world is changing so rapidly that the electronic landscape looks nothing like it did when the first iPhone was released, and that was only in 2007!

 Moore's Law

            There is just one problem with the rapid technological growth and expansion we are experiencing: it is unsustainable. Along with its growth-rate predictions, Moore’s Law also implies that advancements will become harder to achieve and that the gains will shrink in significance with each passing generation (Wikipedia). Most of the initial advancements came from one of two places: (1) shrinking components or (2) materials improvements. But as we run out of new materials to try, and as we reach a plateau in how small we can reliably make classical computing components, we are beginning to hit a wall. Feynman’s lecture on nanotechnology was titled “There’s Plenty of Room at the Bottom,” so what happened?
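The doubling at the heart of Moore's Law can be sketched as simple compound growth. A minimal illustration in Python (the Intel 4004's roughly 2,300 transistors in 1971 and the two-year doubling period are standard textbook figures, not taken from the sources cited here):

```python
# Moore's Law as compound growth: transistor counts double
# roughly every two years.
def transistors(start_count: int, start_year: int, year: int,
                doubling_period: float = 2.0) -> float:
    """Projected transistor count in `year`, doubling every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# The Intel 4004 (1971) shipped with about 2,300 transistors.
print(round(transistors(2300, 1971, 2015)))  # about 9.6 billion
```

Forty-four years of doubling turns thousands of transistors into billions, which is why even small slowdowns in the doubling rate matter so much.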

Richard Feynman, one of the founding fathers of Nanotechnology
 
            The reality is that there is still plenty of room at the bottom, but there isn’t enough room just above the bottom, or at least that space isn’t worth utilizing. As we miniaturize components, it becomes clear that the smallest possible transistor would be a single atom using its electron spin to store data: a quantum computer (Wöhrl). With that end goal in sight, it becomes easier to focus our efforts on this challenge than to keep working on increasingly marginal advancements. There is more to why quantum computers are the future than just the reduction in size. Traditional or classical computers use bits to store data in binary, as 1’s and 0’s; two bits together can represent one of four values: 00, 01, 10, or 11 (PD Knowledge). But because of the quantum nature of electron spin states, a qubit can be spin-up, spin-down, or any combination of the two at once; this is called superposition (Quantum Computing). This gives quantum computers an entirely different mathematical foundation. An analogy for the impact on calculations is cracking a password composed of letters versus one composed of digits: with digits there are only 10 options per character, but with letters there are 26, so guessing a letter password is exponentially harder, and the difficulty grows with the password’s length. A qubit’s ability to represent multiple spin states at the same time allows a quantum computer to run more complex calculations and to carry them out in parallel, meaning answers can be generated in a fraction of the time a classical computer would need.
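The "two bits, four values" idea above can be made concrete: a classical n-bit register holds exactly one of its 2^n states at a time, while an n-qubit register in superposition carries an amplitude for every one of the 2^n states simultaneously. A minimal sketch in plain Python, with no quantum library:

```python
import itertools
import math

# A classical 2-bit register holds exactly ONE of these four values at a time.
classical_states = ["".join(bits) for bits in itertools.product("01", repeat=2)]
print(classical_states)  # ['00', '01', '10', '11']

# A 2-qubit register holds an amplitude for ALL four states at once.
# An equal superposition gives each of the 2^n states amplitude 1/sqrt(2^n).
n = 2
amplitudes = {state: 1 / math.sqrt(2 ** n) for state in classical_states}

# Measurement probabilities are the squared amplitudes and must sum to 1.
probs = {s: a ** 2 for s, a in amplitudes.items()}
print(probs["00"])          # 0.25 for each of the four outcomes
print(sum(probs.values()))  # 1.0
```

Doubling the number of qubits squares the number of amplitudes tracked, which is the source of the parallelism described above.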


Quantum computers can solve difficult problems at high speed

            One way to illustrate the benefit of quantum computing over classical computing is the attempt to crack RSA-2048 encryption. Cracking RSA-2048 requires finding the prime numbers that were multiplied together to produce the public key’s modulus. With a classical computer such a task would take about a billion years, but with a quantum computer it would take a mere 100 seconds (Svore)! Benefits of this magnitude show why there is such a push to develop the technology further, but as with every new technology there are significant roadblocks. Some of these difficulties include building the chips, storing the data, reading the data back, and trapping the spin-state electron, not to mention that the entire device must be run at about 10 millikelvin, roughly 100 times colder than interstellar space (D-Wave Systems).
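The classical side of that comparison is easy to demonstrate. The sketch below factors a toy RSA-style modulus by trial division; the quantum speedup Svore describes comes from Shor's algorithm, which is far beyond a blog snippet, so this only shows the classical problem being attacked:

```python
def factor_semiprime(n: int) -> tuple[int, int]:
    """Classical trial division: find p, q with p * q == n.
    Work grows with sqrt(n), which is why RSA-2048 moduli
    (617 decimal digits) are out of reach classically."""
    p = 2
    while p * p <= n:
        if n % p == 0:
            return p, n // p
        p += 1
    raise ValueError(f"{n} has no nontrivial factors")

# A toy modulus: 15 = 3 * 5, famously the first number factored
# on real quantum hardware using Shor's algorithm.
print(factor_semiprime(15))  # (3, 5)
```

Every extra digit of the modulus multiplies the classical search effort, while Shor's algorithm factors in polynomial time, which is the gap behind the billion-years-versus-100-seconds claim.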

 D-Wave Systems' quantum computer runs at 10 millikelvin

           Even though quantum computers use nanotechnology and work with individual atoms, the devices are anything but small. Current quantum computers take up entire rooms and are vastly expensive to build and operate, but that has always been the history of computers and technology. The ENIAC computer of 1946 took up an entire room; it took over 60 years to commercialize, miniaturize, and develop that technology until it was portable enough to power your iPhone. With quantum computers we are witnessing the birth of future computing, but it will take time until they are ready to fit into your pocket or to power your iPad.

 The ENIAC computer, 1946


References:

Wöhrl, Nicolas. "Introduction to Quantum Computers." <https://www.youtube.com/watch?v=Fb3gn5GsvRk>. YouTube. YouTube, 8 Nov. 2014. Web. 9 Oct. 2015.

 

PD Knowledge. "Quantum Computer in a Nutshell." <https://www.youtube.com/watch?v=0dXNmbiGPS4>. YouTube. YouTube, 11 Oct. 2014. Web. 9 Oct. 2015.

Svore, Krysta. "Quantum Computing: Transforming the Digital Age." <https://www.youtube.com/watch?v=eUp_B7ZpiXk>. YouTube. YouTube, 9 Jun. 2015. Web. 9 Oct. 2015.

"Quantum computing." Wikipedia. Web. 9 Oct. 2015. <https://en.wikipedia.org/wiki/Quantum_computing>. 
"Welcome to the Future." D-Wave Systems. Web. 9 Oct. 2015.
<http://www.dwavesys.com/>.

"Moore's Law." Wikipedia. Web. 9 Oct. 2015. <https://en.wikipedia.org/wiki/Moore's_law>.

Saturday, September 26, 2015

The Future of Building


Since the dawn of time, humans have been closely associated with the tools we use. Our society and lifestyles are molded and shaped by what and how we build; we are by nature builders and creators. From a manufacturing standpoint, a few methods have become second nature for us to understand: additive manufacturing (rapid prototyping and 3D printing), subtractive manufacturing (machining, sawing, cutting, and milling), and formative manufacturing (thermoforming and injection molding). As we approach the topic of nanotechnology, our understanding of building must fundamentally change.




            In the past, scientists have tried using either top-down or bottom-up methods of creating nanostructures. These two methods are akin to the subtractive and additive manufacturing processes we are already familiar with. With the bottom-up method, we start with simple individual nanoparticles that are easy to produce and build them up into larger, more complex components (Gitam University). With the top-down method, we start with something larger and subtract material away until the shape and size match the desired outcome. Both of these methods have their strengths, but they also have their limitations.


These Hydrogen Wave Functions represent some of the unique properties that quantum mechanics have over classical Newtonian physics (Wikipedia).

            One reason nanotechnology is fascinating, and one reason constructing its structures can be so difficult, was explained by George Whitesides in "Self-Assembly and Nanostructured Materials": "In the intermediate region, the region of nanometer-scale structures, quantum and classical behaviors mix" (Whitesides 223). This mixing of Newtonian and quantum physics makes building structures difficult, but it also creates enormous opportunities in design. "[…] Because quantum behavior is fundamentally counterintuitive, there is the optimistic expectation that nanostructures and nanostructured materials will found fundamentally new technologies" (Whitesides 223). In order to unlock the full potential of nanostructures and nanotechnology, a new method of construction and assembly must be devised.


Self Assembly allows particles to build themselves

            Enter the concept of self-assembly, a process modeled after the way nature and biology build complex systems (Nimet). With self-assembly, individual particles that are pre-programmed or designed to connect only in a certain way are introduced to other particles. Through various methods the particles are dynamically rearranged or mixed, and when two particles that share appropriate connectors meet, they join or link together (Whitesides 230). Through self-assembly, particles and systems naturally go from simple to complex over time. The energy or flux required to facilitate assembly is insignificant, and this manufacturing method allows assembly to be carried out in parallel and on a massive scale, far more efficiently than placing molecules one at a time (Whitesides 225).
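The mechanism described above, random mixing plus selective bonding, can be captured in a toy simulation. Everything here (the connector labels, the collision loop) is an invented illustration, not a model from the sources:

```python
import random

# Toy self-assembly: each particle carries a connector type, and only
# complementary connectors ("A" with "a", "B" with "b") can bond.
# Random mixing does the work; no central controller places parts.
COMPLEMENT = {"A": "a", "a": "A", "B": "b", "b": "B"}

def self_assemble(particles: list[str], steps: int = 10_000, seed: int = 0):
    rng = random.Random(seed)
    free = list(particles)
    pairs = []
    for _ in range(steps):
        if len(free) < 2:
            break
        # Thermal agitation: two random free particles collide...
        i, j = rng.sample(range(len(free)), 2)
        a, b = free[i], free[j]
        # ...and bond only if their connectors are complementary.
        if COMPLEMENT[a] == b:
            pairs.append((a, b))
            for k in sorted((i, j), reverse=True):
                free.pop(k)
    return pairs, free

pairs, free = self_assemble(["A", "a", "B", "b", "A", "a"])
print(pairs)  # every bonded pair is a complementary match
print(free)   # any particles that never found a partner
```

Note that no step of the loop places a particle deliberately: structure emerges purely from the bonding rule, which is the essence of the parallel, massively scalable assembly Whitesides describes.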

            What this means is that the future of building won’t look like any of the methods of the past. We won’t be cutting, sawing, or welding molecules together to form nanostructures. Instead, structures and nanodevices will be grown, self-assembled from pre-programmed material. The future of nanotechnology will look closer to growing bacteria in a petri dish than to anything we currently see in traditional manufacturing.
Building in the future will mimic how 
bacteria grow and self-assemble
 

References:

Whitesides, George, Jennah Kriebel, and Brian Mayers. “Self-Assembly and Nanostructured Materials.” Nanostructure Science and Technology. New York: Springer Science + Business, 2005. pp. 217-239. Print.
 
"Medical & Biological Applications" Nimet, Nimet. Web. 9 Oct. 2015. <http://nimet.ufl.edu/nanomed.asp>. 

"Role of Bottom-up and Top-Down approaches in Nano technology" Gitam University, Gitam University. Web. 9 Oct. 2015.
<http://www.gitam.edu/eresource/nano/NANOTECHNOLOGY/role_of_bottomup_and_topdown_a.htm>.

"Quantum Mechanics." Wikipedia. Web. 9 Oct. 2015. <https://en.wikipedia.org/wiki/Quantum_mechanics>.  

N.d. TheDailyMash. Web. 24 Sept. 2015. <http://www.thedailymash.co.uk/news/science-technology/petri-dish-goes-viral-2015011994569>.

Saturday, September 19, 2015

How much room is at the bottom?



Nanotechnology is a growing field with great potential and unrealized capabilities, even though the concept has been around for more than 50 years. The concept was first introduced by the physicist Richard Feynman. In his talk to the American Physical Society, Feynman laid the groundwork for nanotechnology, even though he never actually used the term. To this day he is considered the father of the field, based on the questions and possibilities he posed to his fellow scientists in that talk, “There’s Plenty of Room at The Bottom.”

IBM created a movie made entirely from atoms, proving Feynman's 
prediction that we could manipulate atomic structures. (USA Today)

            In his talk, Feynman covered several topics, ranging from writing and storing information on a small scale to miniaturizing computers and building tiny robots and factories. Some of these concepts made it into the overall goals of nanotechnology, while others have been rendered obsolete by advancements in computing power and storage media. Feynman proposed storing massive amounts of data as small encoded bits, but his method has a drawback: the more data you store, the more area it takes up (even though it is on the nanoscale). With advancements in storage devices such as USB drives and external hard drives, data can be generated, read, and edited far more easily than with the physical storage Feynman discussed, even if those devices are not at the nanoscale. So that goal of his talk was ultimately short-lived, even if theoretically possible.

Feynman's vision of nanotechnology is several thousand 
times smaller than even this tiny motorized car. (japanesenostalgiccar.com)

            Another still-relevant goal of Feynman’s talk was to miniaturize robotic arms, factories, and machines such as lathes. He addressed several problems that result from miniaturization, such as the scaling of electronics, magnetic forces, tolerancing difficulties, Van der Waals attractions, and lubrication. In short, it is not just about making things smaller, but about fully redesigning and rethinking devices as they approach the nanoscale. Objects that work at full size will not function properly when simply scaled down, and this has been the biggest stumbling block of nanotechnology to date.
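One way to see why shrunken designs misbehave is the surface-to-volume ratio: surface effects such as adhesion and friction scale with area (L²), while mass and inertia scale with volume (L³), so surface forces increasingly dominate as a design shrinks. A quick sketch (the cube geometry is a stand-in for any machine part):

```python
# Surface effects scale with L^2, volume effects (mass, inertia) with L^3,
# so the surface-to-volume ratio grows as 1/L when a design is shrunk.
def surface_to_volume_ratio(side_length_m: float) -> float:
    """Ratio for a cube of the given side length (units: 1/m); equals 6 / L."""
    surface = 6 * side_length_m ** 2
    volume = side_length_m ** 3
    return surface / volume

for L in (1.0, 1e-3, 1e-9):  # 1 m, 1 mm, 1 nm
    print(f"L = {L:g} m -> surface/volume = {surface_to_volume_ratio(L):g} per m")
```

Shrinking a part from a meter to a nanometer multiplies this ratio a billionfold, which is why sticking and adhesion, negligible at full size, become the dominant problem at the nanoscale.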

            Currently nanotechnology remains in its infancy and has huge untapped potential. It has been over 50 years since Feynman gave his groundbreaking talk, yet there has been comparatively little advancement over the years. Sure, electron microscopes have improved as Feynman predicted, but we are still failing to produce effective working devices on a small scale. Nanotechnology has found exciting new applications in materials, but where are the microscopic robots, or the groundbreaking small devices? Was Feynman wrong in assuming that we could overcome the issues he laid out in his talk? Have we already hit the bottom, or is there just not enough desire, motivation, or vision to build at the nanoscale?

References:
Feynman, Richard. “There’s Plenty of Room at The Bottom.” Caltech Engineering and Science, Volume 23:5, February 1960, pp 22-36. Print.

"The World’s Smallest Car is a Toyota AA." japanesenostalgiccar.com. japanesenostalgiccar.com, 4 Apr 2011. Web. 9 Oct 2015. <http://japanesenostalgiccar.com/2011/04/12/the-worlds-smallest-car-is-a-toyota-aa/>

"Miniature movie carries a lot of (atomic) weight." USA Today. USA Today , 1 May 2013. Web. 9 Oct 2015. <http://www.usatoday.com/story/tech/sciencefair/2013/05/01/boy-atom-movie/2124075/>

IBM. "A Boy And His Atom: The World's Smallest Movie." <https://www.youtube.com/watch?v=xA4QWwaweWA>. YouTube. YouTube, 30 Apr. 2013. Web. 9 Oct. 2015.