INTRODUCTION
If you are an Electronics Technician past the age of 45, you are probably familiar with these words - stereo multiplex, PLL synthesized tuner, main amp, pre-amp, reverb, mixer, Walkman, Watchman, Betamax, Betacord, SP, LP, EP, metal oxide tape, Marantz, Sansui, Teac, RCA, among other electronics jargon and terminology of the '80s. Welcome - this article is for you.
Remember the days when Electronics was still all about AM, FM, NTSC, PAL and SECAM? The time when it was easy to build a 'project kit' for a power amp, or just a simple 'Morse code practice switcher'? Those were the days when computers, with their DOS commands, seemed too crude to appeal to most Electronics guys (no offense, but that was my experience): running a program was painstaking and tedious, and added to that burden, you had to memorize lots of command 'formats' and draw flowcharts before testing whether a program could make your computer 'run'.
But how come that nowadays things seem to have changed? In the 1990s, the conventional ways of Electronics repair slowly started to fade away, gradually replaced by 'microchips' and SMD ICs that cannot be desoldered with the old reliable soldering iron and desoldering tool (the vacuum-sucking 'Soldapullt' plastic tool). One has to learn how to use a 'hot air' station just to pull out a spider-like 'surface-mounted' IC, and if you are not good at judging how hot the air needs to be, you might unfortunately damage the board or the heat-sensitive parts adjacent to your target. Electronics, it seems, is no longer 'Technician friendly'. Lately, things became even worse as PCBs and the components on them became almost microscopic (sorry for the exaggeration, but when things got this way I began to wear eyeglasses and my vision began to deteriorate). I learned that these 'electronic cards' (no longer called electronic boards) are not just 'double-sided', meaning there are microscopic chips on both surfaces; the PCB (printed circuit board) itself is also double- or even triple-layered, so the old way of 'tracing the circuitry', the traditional method of troubleshooting without a schematic diagram (ask the expert old-timer Electronics repairmen), becomes an 'impossible' option. Blame those Electronics engineers and designers who pushed us into 'mediocrity'.
FLASHBACK
Historically speaking, the idea of 'digital theory' had already been written into the textbooks, maybe as early as the 1960s (I guess so). I know it because we still have that old book sitting on my bookshelf. But it wasn't as explosive then as it is today. The thing is, I never gave much appreciation to the importance of digital electronics during that time. Maybe the most important question of the day is: why the need to change the technology?
Electronics plays a big role in our lives, not only in entertainment but also in medicine, industrial automation, the military and, of course, communication (not to mention the other, less popular applications). The 'AM radio' was perhaps the first electronic 'legacy gadget' that hit the town. Then came the television set. I remember my old folks saying that in the early days (in a third-world country like mine), TV programs were 'aired' as 'live broadcasts', and there were no such things yet as 'replayed' or 'delayed' telecasts. The only machines capable of recording and playing 'things in motion' were movie cameras and film projectors. Music recordings had already been around, and people were able to enjoy them by playing their favorite songs on a 'phonograph'. Then came what was known as 'magnetic tape'.
'Magnetic Tape' technology became dominant and enjoyed a long run as a technology to reckon with (1960-1985?). The first magnetic tape format that became commercially popular was the 'audio reel' type (maybe in the late '60s). TEAC then became a household brand name when it came to quadraphonic, hi-fi music. But reel tapes were bulky and difficult to use, especially when you wished to mount a new set of reels to listen to another set of music. The one A+ factor of reel tape was that it could play up to three hours of continuous, truly high-fidelity music. Then the 'eight-track' came out, but it was short-lived, because it wasn't as handy as its counterpart, the 'audio cassette tape'. In the quality of sound it could reproduce, the cassette was less faithful than reel tape or a first-class phono record (or even the eight-track), but the selling points of cassette tapes were their compactness and ruggedness. That was the time people started to enjoy recording their own voices and hearing them played back (popularly known as 'voice recording').
Magnetic tape technology for 'capturing moving things' came a bit late. Maybe it was in the '70s that 'U-MATIC' video tape recorders came into the picture. They were first used by broadcast studios as 'professional' equipment, exclusively for 'video taping' their TV programs. In the '80s, the first consumer video tape recorder that hit the world was Sony's BETAMAX. A rival Japanese company, JVC, later created its own format, VHS, which was adopted by most countries worldwide and became the 'universal standard'. Then came the turn of laser technology.
In the early days, L.A.S.E.R., along with S.C.U.B.A., was just a popular acronym that schoolkids like me kept memorizing simply to impress other people (take note: L.A.S.E.R. - Light Amplification by Stimulated Emission of Radiation, while S.C.U.B.A. - Self-Contained Underwater Breathing Apparatus). And then came a time when the 'laser gun' became a sci-fi futuristic weapon in the black-and-white movies.
It never entered the minds of ordinary people that it could be applied to replace the diamond stylus (the needle in the tonearm's cartridge) of a high-tech kind of turntable without a platter on it - one where you put in a shiny, compact, silvery plastic record, much smaller than a 45, that spins at very high speed. This is what is now known as the 'CD player'. The CD player was the first true digital consumer device, playing digitally stored information burned into its CD (that circular, shiny, silvery plastic 'thing' you put into the CD player to hear music).
Its video counterpart, the LaserDisc player, used a disc as big as the old 33 (remember the LP records?), on which both surfaces (side A and side B) are used to store video and audio information (although still in analog form). The 'LaserDisc' (before Blu-ray) had the best picture and sound quality of any medium then available for movies or video concerts, but the LD discs were very expensive. The CD replaced the 'audio cassette tape', as its sound reproduction is superb compared with any high-quality audio cassette. But its video counterpart, the LD, didn't last long. The production cost of LD discs was too high, and only a handful of rich people could afford them, not to mention the LD player, which also cost a lot to own. But the introduction of the CD institutionalized the application of the 'laser beam', and new developments in laser technology began. Now we have the VCD, the DVD and Blu-ray, not to mention dozens of less popular versions.
About 'Personal Computers'
The 1980s can be considered the golden era of analog consumer electronics products. It was the time when Japan dominated the world of electronics when it came to color television sets (SONY was no. 1). It was also the period of the 'stereo component system' - individual layers for tuner receiver, reverb, equalizer, tape deck, CD component, pre-amp and main amp (Sansui, Marantz and Pioneer were the brands to consider), each sitting on top of the other. A separate 'speaker system' also became a trend, to enjoy crisp, high-volume sound (Bose and JBL were among the leaders when it came to speaker systems).
During that time, most ordinary people considered computers to be 'corporate equipment', used only for electronic data processing. Only the few people who specialized in computer programming took the opportunity to find jobs in corporate companies as computer programmers (not yet as IT experts). In telecom, the use of computers was gradually being considered, slowly replacing analog equipment, as new theories in digital electronics became realistic and applicable (meaning some of the problems in implementing digital designs and architectures had been solved).
IBM was the first computer company to release on the market a set of unique 'computers' as personalized, commercial products. The PC, PC XT and PC AT were the first batch of 'personal computers', whose main purpose was 'home' use. But this set of computers did not appeal to the majority of people because of the difficulty of running programs: one needed to learn B.A.S.I.C. to make his or her 'personal computer' at least functional. Then Apple introduced the 'Mac', which was much easier to use - with graphic icons and a mouse that could 'point and click' any of the colorful icons so the user could then begin doing a computer task. This breakthrough in the design of computers dramatically changed the attitude of the majority of people toward personal computers.
Tracing The Digital Fingerprints
Fundamentally speaking, both Computer Science and Digital Technology are products of the developments that took place in the field of Electronics.
The development of Digital Technology can be traced back to the time when man learned to use 'raw electricity as a medium of communication'. The 'electric Morse code telegraph', in my own opinion, can be considered the very first form of digital communication system - the crudest and slowest way of transmitting binary signals (based on Morse code's 'dot' and 'dash') by abruptly switching an electric current on and off. It was also the very first type of long-distance electrical communication, transmitting signals over miles of wire from one place to another, and later the first form of wireless communication (through the wireless telegraph).
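To make that 'binary' nature concrete, here is a minimal sketch (my own illustration in Python, obviously not something from the period) that turns a short message into on/off keying, using the standard Morse timing of one unit for a dot and three units for a dash:

```python
# A minimal sketch (illustrative only): text encoded as Morse on/off keying.
# Standard timing: dot = 1 unit ON, dash = 3 units ON, 1 unit OFF between
# elements of a letter, 3 units OFF between letters.

MORSE = {"S": "...", "O": "---", "E": "."}  # tiny sample of the code table

def to_keying(message: str) -> str:
    """Return a string of 1s (current ON) and 0s (current OFF)."""
    letters = []
    for ch in message.upper():
        elements = ["1" if s == "." else "111" for s in MORSE[ch]]
        letters.append("0".join(elements))   # 1 unit OFF inside a letter
    return "000".join(letters)               # 3 units OFF between letters

print(to_keying("SOS"))  # 101010001110111011100010101
```

Crude as it looks, this stream of ons and offs is exactly the kind of two-state signal that every later digital system refined.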
TELEGRAPH TECHNOLOGY
But the accuracy of sending and translating telegraph messages also depended on the accuracy of the operators themselves. It was 'human nature' that pushed this first batch of 'electrical geniuses' (not yet 'officially' called Electronics men) to find ways to make things 'easy'. The first innovation was to send up to eight telegraph messages simultaneously over a single wire (multiplexing), saving the cost of installing multiple bundles of long wires running for miles from one location to another, as sketched below.
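The trick behind that innovation was time-division: each message gets a repeating turn on the shared wire, and the receiving end sorts the slots back out. A toy sketch (my own illustration; real telegraph multiplexers did this with rotating distributors, not software):

```python
# Toy illustration of time-division multiplexing: several telegraph
# "channels" share one wire by taking turns, one character per time slot.

def multiplex(channels):
    """Interleave equal-length channel streams into one serial stream."""
    return "".join("".join(slot) for slot in zip(*channels))

def demultiplex(stream, n):
    """Recover the n original streams by reading every n-th character."""
    return [stream[i::n] for i in range(n)]

msgs = ["HELLO WX", "TRAIN OK", "SEND $50", "ALL WELL"]  # 4 channels shown
line = multiplex(msgs)                # "HTSAEREL..." - one char per channel
assert demultiplex(line, len(msgs)) == msgs
```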
The second plan was to 'avoid' the human operator: to make the transmission and translation of the Morse code 'automatic' (the words 'encoding' and 'decoding' were not yet in use), meaning without the manual translation of the code by a human operator. So, again, it is 'human nature' to make things 'automatic'. The next batch of 'inventors' created telegraph machines that were far different from the original system of transmitting dot-and-dash codes created by Samuel Morse.
Instead of sending printed dot-and-dash codes on a paper tape, the pioneers of the so-called 'teletypewriters' and 'teleprinters' designed different kinds of electromechanical telegraphy machines (the designs kept improving, from R. E. House, D. E. Hughes, E. Baudot, F. Creed and others; *see Wikipedia, "Teleprinter"), connected by long wires to a counterpart remote teleprinter miles away. A typewriter operator simply 'did the typing', hitting the corresponding alphanumeric keys exactly as the actual message was written, instead of a Morse code operator having to hit the 'telegraph key' accurately to send the correct message. Electric signals were then sent through the wires, and the receiving electromechanical machine (the teleprinter) 'automatically' did the typing, without a second Morse code operator at the receiving end to translate the incoming message. Again, the method of sending the electric signals was given 'much care' so that a message would arrive correctly, and on the receiving side this was no simple task: relays, solenoid plungers, wheeled gears and rotating motors were used to position the correct 'type bar' precisely and imprint the 'corresponding letter' on the paper. So a different form of 'counting, timing and sending series of abrupt electric signals' was studied intensively, to produce the best method of 'electric signal coding', unique to this second generation of 'Telegraph Technology'.
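That coding eventually settled on fixed-length characters: in the Baudot and Murray codes, every character is five pulses long, framed by start and stop signals so the receiving machinery knows where each character begins. A rough sketch of the framing idea (my own illustration; the 5-bit values below are invented, not the real ITA2 assignments):

```python
# Sketch of fixed-length character coding with start/stop framing, the idea
# behind teleprinter signaling. NOTE: these 5-bit codes are made up for
# illustration; the actual Baudot/ITA2 code table differs.

CODE = {"A": 0b00011, "B": 0b11001, "C": 0b01110}

def frame(ch: str) -> str:
    """One character -> start bit (0), five data bits, stop bit (1)."""
    data = format(CODE[ch], "05b")
    return "0" + data + "1"

print(frame("A"))  # 0000111 - seven signal units, always the same length
```

Unlike Morse, where an 'E' is short and a 'Q' is long, every framed character takes the same time on the wire, which is what let a dumb mechanical receiver stay in step.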
As Electronics developed, the next generations of 'telegraph machines' improved further. Initially, 'vacuum tubes' replaced possibly half of the mechanical parts of the early, purely 'electromechanical' telegraph machines. Then, when solid state became the trend, transistorized versions appeared, resulting in much smaller sizes. ICs, or integrated circuits, enabled designers to further improve the flexibility and functions of the 'modern electronic telegraph machines'. 'Telex', a technology for sending telegraph messages via 'circuit switching', similar to telephone technology, was also implemented. Meanwhile, in other fields of technology, parallel inventions were being developed to replicate the fundamental operation (or purpose) of telegraph machines: fax machines, inkjet printers and early corporate computers for 'electronic data processing' began to appear and gain popularity. Western Union, the U.S. company that had dominated telegraph technology and the 'telegram business' for more than a century, announced in 2006* that it would no longer continue its telegram service, as e-mail and the internet came into the picture. (*source: SearchUnifiedCommunication, "telegraph")
It is important to mention at this point that Digital Technology evolved initially around ways of sending 'readable' messages, long before graphics, picture images, audio or video content developed (what IT experts now call 'multimedia content'). In fact, big corporations worldwide originally relied on 'computers' mostly to process the files and documents related to their business work, to ease their workload. IBM became a popular trademark when it came to 'business computers', although other brands also competed. It was in this field that Digital Electronics started to become fully appreciated.
START OF COMPUTING TECHNOLOGY
Aside from making things 'easy' and 'automatic', it is also human nature for people to want 'instant results'. In his attempts to develop methods to speed up the manipulation of numbers, man first learned 'how to count' using the fingers of his bare hands. Later, he devised tools to help him do basic arithmetic - for the most practical reason, the use of 'pen and paper' to manually add, subtract, multiply and divide numbers. The Chinese and Japanese depended on the 'abacus' (soroban in Japan) for quick computations. In the Western part of the world, mechanical machines were invented for the same purpose - to do arithmetic quickly, involving large numbers. Blaise Pascal, in 1642, introduced 'toothed wheels' (gears) as a crude type of 'adding machine', and later improved versions by Leibniz (1694), D. Felt, W. S. Burroughs, T. de Colmar, Monroe, Marchant and others* became the forerunners of today's 'modern adding machines', which proved useful to merchants and storekeepers. Around 1920, 'electric-motor-driven' adding machines became the trend, and people began to appreciate their practical uses. (*source:___)
As the world of mathematics advanced and computation was no longer limited to the four fundamental arithmetic operations, pioneers of higher math (integral and differential calculus) dreamed of tools that would help them handle 'astronomical values'. Logarithm tables and slide rules were among the practical tools that 'college boys' (before the scientific calculator was invented) became acquainted with. Even in Charles Babbage's time, there were already efforts to create machines for analytical and differential calculation, for the same purpose of 'giving' instant results. It was in the 1820s that Babbage started creating a 'difference engine', an entirely mechanical design, under a grant supported by the English government*, for scientific application. He even conceived new ideas for another project, the 'analytical engine', but both projects turned out 'futile' because of the limitations of the technology of that time - his ideas were too far ahead of his day, and it was not until the 20th century that they all became a reality, earning him the name 'Father of the Modern Computer'. (Source: * www.CharlesBabbage.net)
The dream of creating a machine that could do mathematical computations 'automatically, quickly and in a package easy to carry' could be the first thing that came into the minds of the people who gave us the handheld and scientific calculators (remember Sharp's 'Elsi Mate' among the popular brand names?). Again, it took many years to evolve. ENIAC and EDVAC were the first 'room-size calculators' (a personal opinion, rather than calling them computers), made up of vacuum tubes, whose main purpose was to compute numbers. As technology progressed, the vacuum tubes were replaced by much smaller 'transistors', which were then compiled into ICs (integrated circuits), making 'pocket-size' calculators possible. A 'computer processor', in a sense, is a compilation of millions of transistors (along with other electronic components) working as calculators (or, more appropriately, as a large group of scientific calculators within a 'single chip'), doing millions of arithmetic computations in a fraction of a second. The bottom line here is that modern computers originated with the simple dream of replacing the 'electromechanical adding machine' with a 'handy electronic calculator'.
BIRTH OF COMPUTER TECHNOLOGY
In the beginning, there were separate efforts in the research and development of the telegraph and computing technologies. Specifically, the main goal of telegraph technology was to improve the sending of 'mails' and 'printed documents' over long distances, in a more economical and speedy process; electrical signals traveling through cable wires proved to be much faster than any transportation vehicle available (land, air or sea), even to this date. On the other hand, the 'original goal' of the pioneers of 'computing technology' was simply to create a 'smart' machine that would help man do his thinking about math problems. It had not yet come into their minds that such a machine could be modified to do more than automatically 'read out' the results of mathematical calculations. Later, genius minds like John von Neumann, along with his colleagues, worked out a system for making a machine that would truly do other tasks besides manipulating numbers. This, then, was the birth of 'Computer Science'.
If one bases his knowledge of how 'modern computers' started (that is, the historical basis behind this topic) only on plain textbooks (as I did), H. Aiken's Mark I, ENIAC and EDVAC are the machines commonly mentioned. But as I researched this topic further on the internet*, I realized that there was indeed a wide variety of 'inventions' related to machines that 'do the thinking' for a certain purpose or task - machines that human beings relied on to accomplish things that mattered at certain periods in world history. During World War II, a machine that could intercept and decode an enemy's 'secret coded' (encrypted) messages became a 'war tool' to counteract the enemy's moves. Germany's encryption machine, the 'Enigma', was countered by England's electromechanical 'Bombe' machines, and England's 'Colossus' machines were later created to break Germany's more 'sophisticated', high-level Lorenz encryption. Alan Turing and Thomas Flowers were among the geniuses never mentioned in the 'textbooks' but now given recognition (in fact, Alan Turing is now known as the father of Computer Science). (*source: Wikipedia's "History of Computing Hardware")
THE PHYSICAL COMPUTER
It is true that the computer was not invented by mere 'accident' (something that just came into one's mind and was worked out overnight); rather, it 'evolved'. The history of the computer can be categorized into three different groups - according to: first, the physical innovations (progressive changes in the design of what is now called computer 'hardware'); second, the technical innovations, that is, the technique by which the physical design works (functioning in either analog or digital operation); and third, the 'programming' innovations (controlled either by manual operation or by stored programs).
The 'hardware', or physical design, of the early computers might be said to have started at the time Charles Babbage got the idea of a purely 'mechanical' engine that could work out mathematical computations involving logarithmic and astronomical values. He even came up with the idea of using 'steam power'* to turn the mechanical gears, levers, etc. of his 'difference engine', but it never came into 'reality', because 'the law of pure mechanics' (the use only of levers, gears, etc.) limited the ability to achieve such a goal. Then came the idea of combining electrical devices with mechanical parts - the second generation of computers, 'electromechanical' in nature. Aiken's Mark I, along with K. Zuse's Z2 model (among others**), were considered 'electrically powered', with the introduction of electromagnetic relays, solenoids, etc. and, importantly, the 'electric motors' that turned the gears and cams. Precision-wise, they worked, but the processing speed was a bit 'slow'. The developments in Electronics (a new branch of physics at that time) brought out a promising future, as 'electron tubes' (or vacuum tubes), descended from Lee de Forest's triode, replaced most of the moving 'mechanical' parts. But at the beginning it looked 'disappointing', as almost every day one or two vacuum tubes needed to be replaced because they easily 'burnt out'***. Similar to the developments in 'Television Technology', 'Electronic Computer Technology' also progressed from all-'vacuum tube' designs to solid state, with the use of 'transistors', then to ICs (integrated circuits) and now to 'Nano-Electronics'. It seems there is no end to making the modern computer (the physical aspect of it) slimmer, lighter and more multi-functional****.
(Sources: * www.CharlesBabbage.net, ** Wikipedia's "Electromechanical Computers", *** Wikipedia's "ENIAC" and **** Wikipedia's "History of Computing Hardware")
ANALOG COMPUTER VERSUS DIGITAL COMPUTER
Analog computers are machines that analyze mathematical functions by responding in proportion to how the actual situation behaves. A slide rule, for example, a simple mechanical computing device, responds to logarithmic functions: its numbered scales of lines give the estimated value for a given mathematical problem. Other, more complex mechanical analog computing machines, like the planimeter, the ball-and-disk integrator and the harmonic synthesizer for predicting 'tides', were invented for the same main purpose - to help man do his thinking for himself (in this case, solving mathematical problems). Later, machines that 'simulate' (or, to be specific, precisely and accurately predict the behavior of certain functions), such as flight simulators, missile monitoring and tracking systems, ballistic fire controls and other 'prototype models' for studying how things react to different 'harsh environments and situations', were also classified as 'analog computers'.
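The slide rule's principle is worth spelling out, since it captures the analog idea in one line of math: because log(ab) = log a + log b, multiplying two numbers reduces to adding two lengths on logarithmic scales. A small sketch (my own illustration):

```python
import math

# The slide-rule principle: adding lengths on logarithmic scales multiplies
# numbers, because log10(a*b) = log10(a) + log10(b).

a, b = 2.0, 3.0
length = math.log10(a) + math.log10(b)  # slide one log scale along the other
product = 10 ** length                  # read the answer off the scale
print(round(product, 6))                # 6.0 - though a real slide rule reads
                                        # only about 3 significant figures
```

That last comment is the 'analog' catch described above: the answer comes instantly, but only as precisely as you can read the scale.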
Analog computers are divided into two broad categories - for 'general purposes' and for 'specific purposes'. They are sub-divided into 1) mechanical, 2) fluid/hydraulic and 3) electrical/electronic, according to how they are physically designed (the technology behind 'how their parts are put together', aside from their applications). The 'key' difference between an analog computer and its digital counterpart is that it corresponds to the 'real time and situation' of a given problem. It functions directly, without storing the data in a temporary memory. It reacts immediately as the 'situation is adjusted' (a slide rule, for example, is manually adjusted to the given mathematical problem and gives an instant result), and it can 'estimate' the value to a not-so-'precise' but, in a way, 'realistic and acceptable' outcome.
On the other hand, the 'digital computer', in its early days, had limited applications. It was designed specifically for 'discrete elements' such as the 'letters' of the alphabet, the 'Arabic numerals' 0 to 9 and special characters such as the comma, semicolon, etc., but not (or I might say, 'not yet') for 'non-printable or non-readable' material like sounds or moving pictures. It was originally designed 'to manipulate numbers', formerly in decimal form and later in binary form (see the sketch after this paragraph). The digital computer would never have reached the level of popularity it 'now enjoys' (that is, manipulating multimedia content, including audio and video) without the 'stage-by-stage' progress in solving the 'technical problems' involved in achieving man's goal of an 'ultimate machine' that is 'smarter' than him (but, of course, with the limit that this machine will be a 'slave' to his own will). Man, in a way, learned how to conquer 'time and space'. Conquering 'time' meant creating 'timing circuits', first in kilohertz, then in megahertz, now in gigahertz and maybe, in the near future, at a much higher level such as terahertz (or even higher); as timing circuits increase in speed, so does the processing of data. 'Space', on the other hand, seems no longer a 'barrier', as Nanoelectronics began to successfully conquer it, accommodating billions of data elements in a slice of 'semiconductor chip' the size of an ant's head. Compared to ENIAC, which occupied a very large 'room', today's computer is a 'hundredfold' smarter, and smaller than any attaché case. Digital computers, in a sense, replaced the analog computers, as the success in conquering 'time' and 'space' made it possible to overcome the limitations that digital computers had before.
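To illustrate the 'decimal then binary' point above: inside a digital machine, every discrete element, whether a numeral or a letter, ends up as a fixed pattern of bits. A small sketch (my own illustration; the character codes shown are modern ASCII, which came later than the early machines described here):

```python
# Illustration: discrete elements (numbers, letters, punctuation) reduced
# to bit patterns. The character codes are modern ASCII, used here only to
# show the idea of 'discrete elements' becoming binary.

n = 45
print(format(n, "b"))                  # 101101 - decimal 45 in binary

for ch in "A9,":
    print(ch, format(ord(ch), "08b"))  # each character as an 8-bit pattern
# A 01000001
# 9 00111001
# , 00101100
```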
(Note: A more specific discussion of 'How Digital Computers Overcame Their Limitations' will come later.)
Stay Tuned...More to Come!