Larry Page BIO
4:38 AM | Author: apis

Lawrence "Larry" Page (born 1972 in East Lansing, Michigan) is an American computer scientist best known as the co-founder of Google Inc. He was ranked 26th on the 2009 Forbes list of the world's billionaires and is the 11th richest person in America. In 2007 he and co-founder Sergey Brin were jointly ranked #1 on the "50 Most Important People on the Web" list by PC World magazine.

Page was born into an academically oriented family in East Lansing, Michigan. His parents were computer science professors at Michigan State University. During an interview, Page said that "their house was usually a mess, with computers and Popular Science magazines all over the place." His attraction to computers started when he was six years old when he got to "play with the stuff lying around." He became the "first kid in his elementary school to turn in an assignment from a word processor." His older brother also taught him to take things apart, and before long he was taking "everything in his house apart to see how it worked." He said, "From a very early age, I also realized I wanted to invent things. So I became really interested in technology...and business. So probably from when I was 12 I knew I was going to start a company eventually."

Page attended a Montessori school in Okemos, Michigan, and graduated from East Lansing High School in 1991. Page holds a Bachelor of Science degree in computer engineering from the University of Michigan with honors and a Master's degree in computer science from Stanford University. While at the University of Michigan, "Page created an inkjet printer made of Lego bricks" (actually a line plotter), served as the president of Eta Kappa Nu (HKN), and was a member of the solar car team.

After enrolling for a Ph.D. program in computer science at Stanford University, Larry Page was in search of a dissertation theme and considered exploring the mathematical properties of the World Wide Web, understanding its link structure as a huge graph. His supervisor Terry Winograd encouraged him to pursue this idea, which Page later recalled as "the best advice I ever got". Page then focused on the problem of finding out which web pages link to a given page, considering the number and nature of such backlinks to be valuable information about that page (with the role of citations in academic publishing in mind). In his research project, nicknamed "BackRub", he was soon joined by Sergey Brin, a fellow Stanford Ph.D. student.

John Battelle, co-founder of Wired magazine, wrote of Page that he had reasoned that the "entire Web was loosely based on the premise of citation – after all, what is a link but a citation? If he could divine a method to count and qualify each backlink on the Web, as Page puts it 'the Web would become a more valuable place'." Battelle further described how Page and Brin began working together on the project:

"At the time Page conceived of BackRub, the Web comprised an estimated 10 million documents, with an untold number of links between them. The computing resources required to crawl such a beast were well beyond the usual bounds of a student project. Unaware of exactly what he was getting into, Page began building out his crawler.
"The idea's complexity and scale lured Brin to the job. A polymath who had jumped from project to project without settling on a thesis topic, he found the premise behind BackRub fascinating. "I talked to lots of research groups" around the school, Brin recalls, "and this was the most exciting project, both because it tackled the Web, which represents human knowledge, and because I liked Larry."

Brin and Page originally met in March 1995, during a spring orientation of new computer science Ph.D. candidates. Brin, who had already been in the program for two years, was assigned to show some students, including Page, around campus, and they later became good friends.

To convert the backlink data gathered by BackRub's web crawler into a measure of importance for a given web page, Brin and Page developed the PageRank algorithm, and realized that it could be used to build a search engine far superior to existing ones. It relied on a new kind of technology that analyzed the relevance of the backlinks connecting one Web page to another. In August 1996, the initial version of Google was made available, still on the Stanford University Web site.
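The core idea can be sketched in a few lines of Python. This is a simplified illustration of power-iteration PageRank, not Google's actual code; the damping factor, iteration count, and toy link graph are all chosen for the example:

```python
# Minimal PageRank sketch: each page splits its score evenly among its
# outlinks; a damping factor models a surfer who occasionally jumps to
# a random page instead of following a link.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # dangling page: spread its rank over all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Toy web: A links to B and C, B links to C, C links back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
# C, which collects rank from both A and B, comes out highest.
```

The total rank stays normalized to 1 across iterations, so the scores can be read as a probability distribution over pages.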

In 1998, Brin and Page founded Google, Inc. Page ran Google as co-president along with Brin until 2001, when they hired Eric Schmidt as chairman and CEO of Google.

In 2007, Page was cited by PC World as #1 on the list of the 50 most important people on the web, along with Brin and Schmidt.

John von Neumann BIO
4:26 AM | Author: apis

John von Neumann (December 28, 1903 – February 8, 1957) was an Austro-Hungarian-born American mathematician who made major contributions to a vast range of fields, including set theory, functional analysis, quantum mechanics, ergodic theory, continuous geometry, economics and game theory, computer science, numerical analysis, hydrodynamics (of explosions), and statistics, as well as many other mathematical fields. He is generally regarded as one of the foremost mathematicians of the 20th century. The mathematician Jean Dieudonné called von Neumann "the last of the great mathematicians." Even in Budapest, in the time that produced geniuses like Szilárd (1898), Wigner (1902), and Teller (1908), his brilliance stood out.

Most notably, von Neumann was a pioneer of the application of operator theory to quantum mechanics, a principal member of the Manhattan Project and the Institute for Advanced Study in Princeton (as one of the few originally appointed), and a key figure in the development of game theory and the concepts of cellular automata and the universal constructor. Along with Edward Teller and Stanislaw Ulam, von Neumann worked out key steps in the nuclear physics involved in thermonuclear reactions and the hydrogen bomb.

Von Neumann's hydrogen bomb work also carried over into computing, where he and Stanislaw Ulam developed simulations on von Neumann's digital computers for the hydrodynamic computations. During this time he contributed to the development of the Monte Carlo method, which allowed complicated problems to be approximated using random numbers. Because using lists of "truly" random numbers was extremely slow for the ENIAC, von Neumann developed a way of generating pseudorandom numbers, the middle-square method. Though this method has been criticized as crude, von Neumann was aware of this: he justified it as being faster than any other method at his disposal, and also noted that when it went awry it did so obviously, unlike methods which could be subtly incorrect.
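The middle-square method is simple enough to show in full: square the current value and take the middle digits as the next "random" number. The sketch below uses a four-digit state and an arbitrary seed for illustration; as noted above, the generator is crude, and it can fall into short cycles (including sticking at zero):

```python
# Von Neumann's middle-square pseudorandom number generator (illustrative).

def middle_square(seed, n_digits=4):
    """Yield a stream of pseudorandom n_digits-digit numbers."""
    value = seed
    while True:
        # Pad the square to 2*n_digits so the "middle" is well defined.
        squared = str(value ** 2).zfill(2 * n_digits)
        start = (len(squared) - n_digits) // 2
        value = int(squared[start:start + n_digits])
        yield value

gen = middle_square(1234)
first_three = [next(gen) for _ in range(3)]
# 1234^2 = 01522756 -> 5227; 5227^2 = 27321529 -> 3215; 3215^2 = 10336225 -> 3362
# first_three == [5227, 3215, 3362]
```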

While consulting for the Moore School of Electrical Engineering on the EDVAC project, von Neumann wrote an incomplete set of notes titled the First Draft of a Report on the EDVAC. The paper, which was widely distributed, described a computer architecture in which the data and the program are both stored in the computer's memory in the same address space. This architecture became the de facto standard until technology enabled more advanced architectures. The earliest computers were 'programmed' by altering the electronic circuitry. Although the single-memory, stored program architecture became commonly known by the name von Neumann architecture as a result of von Neumann's paper, the architecture's description was based on the work of J. Presper Eckert and John William Mauchly, inventors of the ENIAC at the University of Pennsylvania.
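The stored-program idea can be illustrated with a toy machine in which instructions and data occupy one list and the machine fetches, decodes, and executes from that same address space. The opcodes and memory layout here are invented for the sketch and are not from the First Draft:

```python
# Toy stored-program machine: code and data share one memory.

def run(memory):
    acc, pc = 0, 0            # accumulator and program counter
    while True:
        op, arg = memory[pc], memory[pc + 1]
        pc += 2
        if op == "LOAD":      # read a data cell into the accumulator
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":   # write the accumulator back to memory
            memory[arg] = acc
        elif op == "JNZ":     # conditional jump, as in a control unit
            pc = arg if acc else pc
        elif op == "HALT":
            return memory

# Cells 0-7 hold the program, 8-9 are padding, 10-11 hold the data,
# and cell 12 receives the result: 2 + 3.
memory = ["LOAD", 10, "ADD", 11, "STORE", 12, "HALT", 0, 0, 0, 2, 3, 0]
result = run(memory)
# result[12] == 5
```

Because the program is itself data in memory, it could in principle be read or modified by the very instructions it contains, which is the essential point of the architecture.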

Von Neumann also created the field of cellular automata without the aid of computers, constructing the first self-replicating automata with pencil and graph paper. The concept of a universal constructor was fleshed out in his posthumous work Theory of Self Reproducing Automata. Von Neumann proved that the most effective way of performing large-scale mining operations such as mining an entire moon or asteroid belt would be by using self-replicating machines, taking advantage of their exponential growth.

He is credited with at least one contribution to the study of algorithms. Donald Knuth cites von Neumann as the inventor, in 1945, of the merge sort algorithm, in which the first and second halves of an array are each sorted recursively and then merged together. His algorithm for simulating a fair coin with a biased coin is used in the "software whitening" stage of some hardware random number generators.
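The fair-coin trick works by flipping the biased coin twice: heads-tails yields "heads", tails-heads yields "tails", and equal pairs are discarded. Since P(HT) equals P(TH) for any fixed bias, the surviving outputs are unbiased. A sketch, with the 90% bias and sample size chosen purely for the demonstration:

```python
import random

def fair_flip(biased_flip):
    """Return an unbiased True/False given a biased zero-arg flip function."""
    while True:
        a, b = biased_flip(), biased_flip()
        if a != b:
            return a  # HT -> True, TH -> False, each equally likely

# Demonstration with a heavily biased coin (about 90% heads).
rng = random.Random(42)
biased = lambda: rng.random() < 0.9
results = [fair_flip(biased) for _ in range(10000)]
heads_fraction = sum(results) / len(results)  # close to 0.5
```

The cost is wasted flips: the more biased the source, the more pairs are discarded before an unequal pair appears.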

He also engaged in exploration of problems in numerical hydrodynamics. With R. D. Richtmyer he developed an algorithm defining artificial viscosity that improved the understanding of shock waves. It is possible that we would not understand much of astrophysics, and might not have highly developed jet and rocket engines without that work. The problem was that when computers solve hydrodynamic or aerodynamic problems, they try to put too many computational grid points at regions of sharp discontinuity (shock waves). The artificial viscosity was a mathematical trick to slightly smooth the shock transition without sacrificing basic physics.

Charles Babbage BIO
4:06 AM | Author: apis
Charles Babbage, FRS (26 December 1791 – 18 October 1871) was an English mathematician, philosopher, inventor and mechanical engineer who originated the concept of a programmable computer. Parts of his uncompleted mechanisms are on display in the London Science Museum. In 1991, a perfectly functioning difference engine was constructed from Babbage's original plans. Built to tolerances achievable in the 19th century, the success of the finished engine indicated that Babbage's machine would have worked. Nine years later, the Science Museum completed the printer Babbage had designed for the difference engine, an astonishingly complex device for the 19th century. Considered a "father of the computer", Babbage is credited with inventing the first mechanical computer that eventually led to more complex designs.

His father's money allowed Charles to receive instruction from several schools and tutors during the course of his elementary education. Around the age of eight he was sent to a country school in Alphington near Exeter to recover from a life-threatening fever. His parents ordered that his "brain was not to be taxed too much" and Babbage felt that "this great idleness may have led to some of my childish reasonings." For a short time he attended King Edward VI Grammar School in Totnes, South Devon, but his health forced him back to private tutors for a time. He then joined the 30-student Holmwood Academy, in Baker Street, Enfield, Middlesex, under Reverend Stephen Freeman. The academy had a well-stocked library that prompted Babbage's love of mathematics. He studied with two more private tutors after leaving the academy. Of the first, a clergyman near Cambridge, Babbage said, "I fear I did not derive from it all the advantages that I might have done." The second was an Oxford tutor from whom Babbage learned enough of the Classics to be accepted to Cambridge.

Babbage arrived at Trinity College, Cambridge in October 1810. He had read extensively in Leibniz, Joseph Louis Lagrange, Thomas Simpson, and Lacroix and was seriously disappointed in the mathematical instruction available at Cambridge. In response, he, John Herschel, George Peacock, and several other friends formed the Analytical Society in 1812. Babbage, Herschel and Peacock were also close friends with future judge and patron of science Edward Ryan. Babbage and Ryan married two sisters.

In 1812 Babbage transferred to Peterhouse, Cambridge. He was the top mathematician at Peterhouse, but did not graduate with honours. He instead received an honorary degree without examination in 1814.

Babbage sought a method by which mathematical tables could be calculated mechanically, removing the high rate of human error. Three different factors seem to have influenced him: a dislike of untidiness; his experience working on logarithmic tables; and existing work on calculating machines carried out by Wilhelm Schickard, Blaise Pascal, and Gottfried Leibniz. He first discussed the principles of a calculating engine in a letter to Sir Humphry Davy in 1822.

Babbage's machines were among the first mechanical computers, although they were not actually completed, largely because of funding problems and personality issues. He directed the building of some steam-powered machines that achieved some success, suggesting that calculations could be mechanized. Although Babbage's machines were mechanical and unwieldy, their basic architecture was very similar to a modern computer. The data and program memory were separated, operation was instruction based, the control unit could make conditional jumps and the machine had a separate I/O unit.

The London Science Museum has constructed two Difference Engines, according to Babbage's plans for the Difference Engine No 2. One is owned by the museum; the other, owned by technology millionaire Nathan Myhrvold, went on exhibit at the Computer History Museum in Mountain View, California on 10 May 2008. The two models that have been constructed are not replicas; until the assembly of the first Difference Engine No 2 by the London Science Museum, no model of the Difference Engine No 2 existed.

Dr. Vinton G. Cerf BIO
3:56 AM | Author: apis
Vinton Gray "Vint" Cerf (pronounced /ˈsɜrf/; born June 23, 1943) is an American computer scientist who is the "person most often called 'the father of the Internet'." His contributions have been recognized repeatedly, with honorary degrees and awards that include the National Medal of Technology, the Turing Award, and the Presidential Medal of Freedom.

In the early days, Cerf was a DOD DARPA program manager funding various groups to develop TCP/IP technology. When the Internet began to transition to a commercial opportunity, Cerf moved to MCI where he was instrumental in the development of the first commercial email system (MCI Mail) connected to the Internet.

Vinton Cerf was instrumental in the funding and formation of ICANN from the start. Cerf waited in the wings for a year before he stepped forward to join the ICANN Board. Eventually he became the Chairman of ICANN.

Cerf has worked for Google as its Vice President and Chief Internet Evangelist since September 2005. In this function he has become well known for his predictions on how technology will affect future society, encompassing such areas as artificial intelligence, environmentalism, the advent of IPv6 and the transformation of the television industry and its delivery model.

Cerf also went to the same high school as Jon Postel and Steve Crocker; he wrote the former's obituary. Both were likewise instrumental in the creation of the Internet as we know it.

Cerf's first job after obtaining his B.S. in Mathematics from Stanford University was at IBM, where he worked for less than two years as a systems engineer supporting QUIKTRAN. He left IBM to attend graduate school at UCLA, where he earned his master's degree in 1970 and his PhD degree in 1972. During his graduate student years, he studied under Professor Gerald Estrin, worked in Professor Leonard Kleinrock's data packet networking group that connected the first two nodes of the ARPANet,[10] the predecessor to the Internet, and "contributed to a host-to-host protocol" for the ARPANet. While at UCLA, he also met Robert E. Kahn, who was working on the ARPANet hardware architecture. After receiving his doctorate, Cerf became an assistant professor at Stanford University from 1972 to 1976, where he "conducted research on packet network interconnection protocols and co-designed the DoD TCP/IP protocol suite with Kahn."

Cerf then moved to DARPA in 1976, where he stayed until 1982.

As vice president of MCI Digital Information Services from 1982 to 1986, Cerf led the engineering of MCI Mail, the first commercial email service to be connected to the Internet. Cerf rejoined MCI during 1994 and served as Senior Vice President of Technology Strategy. In this role, he helped to guide corporate strategy development from a technical perspective. Previously, he served as MCI's senior vice president of Architecture and Technology, leading a team of architects and engineers to design advanced networking frameworks, including Internet-based solutions for delivering a combination of data, information, voice and video services for business and consumer use.

During 1997, Cerf joined the Board of Trustees of Gallaudet University, a university for the education of the deaf and hard-of-hearing. Cerf is himself hard of hearing.

Cerf joined the board of the Internet Corporation for Assigned Names and Numbers (ICANN) in 1999, and served until the end of 2007.

Cerf is a member of the Bulgarian President Georgi Parvanov's IT Advisory Council, a group created by Presidential Decree on March 8, 2002. He is also a member of the Advisory Board of Eurasia Group, the political risk consultancy.

Cerf is also working on the Interplanetary Internet, together with NASA's Jet Propulsion Laboratory. It will be a new standard to communicate from planet to planet, using radio/laser communications that are tolerant of signal degradation.

During February 2006, Cerf testified before the U.S. Senate Committee on Commerce, Science, and Transportation's Hearing on “Network Neutrality”.

Cerf currently serves on the board of advisors of Scientists and Engineers for America, an organization focused on promoting sound science in American government.

Cerf is on the board of advisors of The Hyperwords Company Ltd of the UK, which works to make the web more usefully interactive and which has produced the free Firefox Add-On called 'Hyperwords'.

During 2008 Cerf chaired the IDNAbis working group of the IETF.

Cerf was a major contender to be designated the nation's first Chief Technology Officer by President Barack Obama.

Bill Gates BIO
3:40 AM | Author: apis

William Henry "Bill" Gates III (born October 28, 1955) is an American business magnate, philanthropist, and chairman of Microsoft, the software company he founded with Paul Allen. He is consistently ranked among the world's wealthiest people and was the wealthiest overall as of 2009. During his career at Microsoft, Gates held the positions of CEO and chief software architect, and remains the largest individual shareholder with more than 8 percent of the common stock. He has also authored or co-authored several books.

Gates is one of the best-known entrepreneurs of the personal computer revolution. Although he is admired by many, a number of industry insiders criticize his business tactics, which they consider anti-competitive, an opinion which has in some cases been upheld by the courts (see Criticism of Microsoft). In the later stages of his career, Gates has pursued a number of philanthropic endeavors, donating large amounts of money to various charitable organizations and scientific research programs through the Bill & Melinda Gates Foundation, established in 2000.

Bill Gates stepped down as chief executive officer of Microsoft in January 2000. He remained as chairman and created the position of chief software architect. In June 2006, Gates announced that he would be transitioning from full-time work at Microsoft to part-time work and full-time work at the Bill & Melinda Gates Foundation. He gradually transferred his duties to Ray Ozzie, chief software architect, and Craig Mundie, chief research and strategy officer. Gates' last full-time day at Microsoft was June 27, 2008. He remains at Microsoft as non-executive chairman.

After reading the January 1975 issue of Popular Electronics that demonstrated the Altair 8800, Gates contacted Micro Instrumentation and Telemetry Systems (MITS), the creators of the new microcomputer, to inform them that he and others were working on a BASIC interpreter for the platform. In reality, Gates and Allen did not have an Altair and had not written code for it; they merely wanted to gauge MITS's interest. MITS president Ed Roberts agreed to meet them for a demo, and over the course of a few weeks they developed an Altair emulator that ran on a minicomputer, and then the BASIC interpreter. The demonstration, held at MITS's offices in Albuquerque, was a success and resulted in a deal with MITS to distribute the interpreter as Altair BASIC. Paul Allen was hired into MITS, and Gates took a leave of absence from Harvard to work with Allen at MITS in Albuquerque in November 1975. They named their partnership "Micro-Soft" and had their first office located in Albuquerque. Within a year, the hyphen was dropped, and on November 26, 1976, the trade name "Microsoft" was registered with the Office of the Secretary of the State of New Mexico. Gates never returned to Harvard to complete his studies.

Microsoft's BASIC was popular with computer hobbyists, but Gates discovered that a pre-market copy had leaked into the community and was being widely copied and distributed. In February 1976, Gates wrote an Open Letter to Hobbyists in the MITS newsletter saying that MITS could not continue to produce, distribute, and maintain high-quality software without payment. This letter was unpopular with many computer hobbyists, but Gates persisted in his belief that software developers should be able to demand payment. Microsoft became independent of MITS in late 1976, and it continued to develop programming language software for various systems. The company moved from Albuquerque to its new home in Bellevue, Washington on January 1, 1979.

During Microsoft's early years, all employees had broad responsibility for the company's business. Gates oversaw the business details, but continued to write code as well. In the first five years, he personally reviewed every line of code the company shipped, and often rewrote parts of it as he saw fit.

In 1980, IBM approached Microsoft to write the BASIC interpreter for its upcoming personal computer, the IBM PC. When IBM's representatives mentioned that they needed an operating system, Gates referred them to Digital Research (DRI), makers of the widely used CP/M operating system. IBM's discussions with Digital Research went poorly, and they did not reach a licensing agreement. IBM representative Jack Sams mentioned the licensing difficulties during a subsequent meeting with Gates and told him to get an acceptable operating system. A few weeks later Gates proposed using 86-DOS (QDOS), an operating system similar to CP/M that Tim Paterson of Seattle Computer Products (SCP) had made for hardware similar to the PC. Microsoft made a deal with SCP to become the exclusive licensing agent, and later the full owner, of 86-DOS. After adapting the operating system for the PC, Microsoft delivered it to IBM as PC-DOS in exchange for a one-time fee of $50,000. Gates did not offer to transfer the copyright on the operating system, because he believed that other hardware vendors would clone IBM's system. They did, and the sales of MS-DOS made Microsoft a major player in the industry.

Gates oversaw Microsoft's company restructuring on June 25, 1981, which re-incorporated the company in Washington and made Gates President of Microsoft and the Chairman of the Board. Microsoft launched its first retail version of Microsoft Windows on November 20, 1985, and in August of that year the company struck a deal with IBM to develop a separate operating system called OS/2. Although the two companies successfully developed the first version of the new system, mounting creative differences undermined the partnership. Gates distributed an internal memo on May 16, 1991, announcing that the OS/2 partnership was over and Microsoft would shift its efforts to the Windows NT kernel development.

Tim Berners-Lee BIO
2:22 AM | Author: apis

Sir Timothy John "Tim" Berners-Lee, OM, KBE, FRS, FREng, FRSA (born 8 June 1955), is a British engineer and computer scientist and MIT professor credited with inventing the World Wide Web, making the first proposal for it in March 1989. On 25 December 1990, with the help of Robert Cailliau and a young student at CERN, he implemented the first successful communication between an HTTP client and server via the Internet. In 1999, Time Magazine named Berners-Lee one of the 100 Most Important People of the 20th Century. In 2007, he was ranked joint first, alongside Albert Hofmann, in The Telegraph's list of 100 greatest living geniuses. Berners-Lee is the director of the World Wide Web Consortium (W3C), which oversees the Web's continued development. He is also the founder of the World Wide Web Foundation, and is a senior researcher and holder of the 3Com Founders Chair at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). He is a director of The Web Science Research Initiative (WSRI), and a member of the advisory board of the MIT Center for Collective Intelligence. In April 2009, he was elected as a member of the United States National Academy of Sciences, based in Washington, D.C.

While an independent contractor at CERN from June to December 1980, Berners-Lee proposed a project based on the concept of hypertext, to facilitate sharing and updating information among researchers. While there, he built a prototype system named ENQUIRE. After leaving CERN in 1980, he went to work at John Poole's Image Computer Systems, Ltd, in Bournemouth, England, but returned to CERN in 1984 as a fellow. In 1989, CERN was the largest Internet node in Europe, and Berners-Lee saw an opportunity to join hypertext with the Internet: "I just had to take the hypertext idea and connect it to the Transmission Control Protocol and domain name system ideas and — ta-da! — the World Wide Web."[11] He wrote his initial proposal in March 1989, and in 1990, with the help of Robert Cailliau, produced a revision which was accepted by his manager, Mike Sendall. He used similar ideas to those underlying the Enquire system to create the World Wide Web, for which he designed and built the first Web browser, which also functioned as an editor (WorldWideWeb, running on the NeXTSTEP operating system), and the first Web server, CERN HTTPd (short for HyperText Transfer Protocol daemon).
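The request/response exchange at the heart of that first client-server communication can be sketched with Python's standard library. This is a modern convenience, not Berners-Lee's original code; the page content and loopback address are chosen for the demonstration:

```python
# A minimal HTTP server and client talking to each other, in the spirit
# of the first HTTP exchange: the client requests a document, the server
# answers with a status line, headers, and an HTML body.

import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<h1>Hello from a tiny web server</h1>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Hello)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/"
with urllib.request.urlopen(url) as response:
    page = response.read().decode()  # the HTML body sent by the server
server.shutdown()
```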

The first Web site was built at CERN and first put on line on 6 August 1991. According to CERN, the world's first-ever web site and web server ran on a NeXT computer at CERN, and the first web page centred on information regarding the WWW project itself. Visitors could learn more about hypertext, read technical details for creating their own web page, and even find an explanation of how to search the Web for information. No screenshots of this original page survive and, in any case, changes were made daily to the information available on the page as the WWW project developed; a later copy (1992) can be found on the World Wide Web Consortium website. The site provided an explanation of what the World Wide Web was, and how one could use a browser and set up a Web server.

In 1994, Berners-Lee founded the World Wide Web Consortium (W3C) at MIT. It comprised various companies that were willing to create standards and recommendations to improve the quality of the Web. Berners-Lee made his idea available freely, with no patent and no royalties due. The World Wide Web Consortium decided that its standards should be based on royalty-free technology, so that they could easily be adopted by anyone.

In 2001, Berners-Lee became a patron of the East Dorset Heritage Trust, having previously lived in Colehill in Wimborne, East Dorset, England.

In December 2004, he accepted a chair in Computer Science at the School of Electronics and Computer Science, University of Southampton, England, to work on his new project, the Semantic Web.[17]

In June 2009 Prime Minister Gordon Brown announced that Berners-Lee would work with the UK Government to help make data more open and accessible on the Web, building on the work of the Power of Information Task Force.

He was also one of the pioneer voices in favour of Net Neutrality,[19] and has expressed the view that ISPs should supply "connectivity with no strings attached," and should neither control nor monitor customers' browsing activities without their express consent.

In a Times article in October 2009, Berners-Lee admitted that the forward slashes ("//") in a web address were actually "unnecessary". He told the newspaper that he could easily have designed URLs not to have the forward slashes. "There you go, it seemed like a good idea at the time," he said in his lighthearted apology.

In November 2009, Berners-Lee launched the World Wide Web Foundation in order to "Advance the Web to empower humanity by launching transformative programs that build local capacity to leverage the Web as a medium for positive change."

Steve Jobs BIO
2:11 AM | Author: apis

Steven Paul "Steve" Jobs (born February 24, 1955) is an American businessman, and the co-founder and chief executive officer of Apple Inc. Jobs previously served as CEO of Pixar Animation Studios.

In the late 1970s, Jobs, with Apple co-founder Steve Wozniak, created one of the first commercially successful personal computers. In the early 1980s, Jobs was among the first to see the commercial potential of the mouse-driven graphical user interface. After losing a power struggle with the board of directors in 1985, Jobs resigned from Apple and founded NeXT, a computer platform development company specializing in the higher education and business markets. NeXT's subsequent 1997 buyout by Apple Computer Inc. brought Jobs back to the company he co-founded, and he has served as its CEO since then. Steve Jobs was listed as Fortune Magazine's Most Powerful Businessman of 2007. In 2009 he was ranked #57 on Forbes' list of the World's Most Powerful People.

In 1986, he acquired the computer graphics division of Lucasfilm Ltd which was spun off as Pixar Animation Studios. He remained CEO and majority shareholder until its acquisition by the Walt Disney Company in 2006. Jobs is currently a member of Walt Disney Company's Board of Directors.

Jobs' history in business has contributed greatly to the myths of the idiosyncratic, individualistic Silicon Valley entrepreneur, emphasizing the importance of design and understanding the crucial role aesthetics play in public appeal. His work driving forward the development of products that are both functional and elegant has earned him a devoted following.[17]

In mid-January 2009, Jobs took a five-month leave of absence from Apple to undergo a liver transplant.

Beginnings of Apple Computer

In 1976, Steve Jobs and Stephen Wozniak, with funding from multimillionaire A.C. "Mike" Markkula, founded Apple. Prior to co-founding Apple, Wozniak was an electronics hacker. Jobs and Wozniak had been friends for several years, having met in 1971, when their mutual friend, Bill Fernandez, introduced 21-year-old Wozniak to 16-year-old Jobs. Steve Jobs managed to interest Wozniak in assembling a computer and selling it. As Apple continued to expand, the company began looking for an experienced executive to help manage its expansion. In 1983, Steve Jobs lured John Sculley away from Pepsi-Cola to serve as Apple's CEO, asking, "Do you want to spend the rest of your life selling sugared water to children, or do you want a chance to change the world?" The following year, Apple set out to do just that, starting with a Super Bowl television commercial titled "1984." At Apple's annual shareholders meeting on January 24, 1984, an emotional Jobs introduced the Macintosh to a wildly enthusiastic audience; Andy Hertzfeld described the scene as "pandemonium." The Macintosh became the first commercially successful small computer with a graphical user interface. The development of the Mac was started by Jef Raskin, and eventually taken over by Jobs.

While Jobs was a persuasive and charismatic director for Apple, some of his employees from that time had described him as an erratic and temperamental manager. An industry-wide sales slump towards the end of 1984 caused a deterioration in Jobs's working relationship with Sculley, and at the end of May 1985 – following an internal power struggle and an announcement of significant layoffs – Sculley relieved Jobs of his duties as head of the Macintosh division.

[Image: Steve Jobs introducing the very first iPod at a low-key event in 2001]
Guido van Rossum BIO
1:56 AM | Author: apis


Van Rossum was born and grew up in the Netherlands, where he received a master's degree in mathematics and computer science from the University of Amsterdam in 1982. He later worked for various research institutes, including the Dutch National Research Institute for Mathematics and Computer Science (CWI), Amsterdam, the National Institute of Standards and Technology (NIST), Gaithersburg, Maryland, and the Corporation for National Research Initiatives (CNRI), Reston, Virginia.

In December 2005, Van Rossum was hired by Google, where he wrote a web-based code review tool in Python.

Van Rossum received the 2001 Award for the Advancement of Free Software from the Free Software Foundation (FSF) at the 2002 FOSDEM conference in Brussels, Belgium. He received an NLUUG Award in May 2003. In 2006 he was recognized as a Distinguished Engineer by the Association for Computing Machinery.

Personal life

Guido van Rossum is the brother of Just van Rossum, a type designer and also a programmer. Just van Rossum designed the font used in the "Python Powered" logo. Guido currently lives in California with his American wife Kim Knapp and their son Orlijn.


While working at the Stichting Mathematisch Centrum (CWI), Guido van Rossum wrote and contributed a glob() routine to BSD Unix in 1986. Van Rossum also worked on the development of the ABC programming language.
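For readers unfamiliar with the routine: glob() expands shell-style wildcard patterns (*, ?, [...]) into lists of matching filenames. A minimal illustration of the same pattern-matching idea, using Python's standard fnmatch module rather than Van Rossum's original C code:

```python
from fnmatch import fnmatch

# Shell-style wildcards: '*' matches any run of characters,
# '?' matches a single character, '[...]' matches a character set.
files = ["main.c", "main.o", "glob.c", "README"]

# Keep only the C source files, the way 'ls *.c' would.
c_sources = [f for f in files if fnmatch(f, "*.c")]
print(c_sources)  # → ['main.c', 'glob.c']
```

In real programs the `glob` module (or the C library's glob()) goes further by actually scanning directories, but the pattern-matching core is the same.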


About the origin of Python, Van Rossum wrote in 1996:

Over six years ago, in December 1989, I was looking for a "hobby" programming project that would keep me occupied during the week around Christmas. My office … would be closed, but I had a home computer, and not much else on my hands. I decided to write an interpreter for the new scripting language I had been thinking about lately: a descendant of ABC that would appeal to Unix/C hackers. I chose Python as a working title for the project, being in a slightly irreverent mood (and a big fan of Monty Python's Flying Circus).

1:47 AM | Author: apis
Alan Cox (born July 22, 1968 in Solihull, England) is a British computer programmer heavily involved in the development of the Linux kernel since its early days in 1991. He lives in Swansea, Wales with his wife, Telsa Gwynne.

While employed on the campus of Swansea University, he installed a very early version of Linux on one of the machines belonging to the university computer society. This was one of the first Linux installations on a busy network, and revealed many bugs in the networking code. Cox fixed many of these bugs, and went on to rewrite much of the networking subsystem. He then became one of the main developers and maintainers of the whole kernel.

He maintained the 2.2 branch, and his own versions of the 2.4 branch (signified by an "ac" in the version, for example 2.4.13-ac1). This branch was very stable and contained bugfixes that went directly into the vendor kernels. He was once commonly regarded as "second in command" after Linus Torvalds himself, before reducing his involvement with Linux to study for an MBA. However, on July 28, 2009, Cox stepped down as maintainer of the TTY layer after receiving criticism from Torvalds.

Alan was employed by Linux distributor Red Hat for ten years, leaving in January 2009. He is now employed by Intel.

He has also been involved in the GNOME and X.Org projects, and was the main developer of AberMUD, which he wrote whilst a student at the University of Wales, Aberystwyth.

Alan Cox is an ardent supporter of programming freedom, and an outspoken opponent of software patents, the DMCA and the CBDTPA. He resigned from a subgroup of Usenix in protest, and said he would not visit the United States for fear of being imprisoned after the arrest of Dmitry Sklyarov for DMCA violations.

In January 2007, he applied for a series of patents on "RMS", or Rights Management Systems, reportedly covering digital rights management techniques. Red Hat Inc., Cox's former employer, has stated (in a document drafted by Mark Webbink and Cox himself) that it will not use patents against free software projects.

Cox is also an adviser to the Foundation for Information Policy Research and the Open Rights Group.

12:53 AM | Author: apis

Early years

Linus Torvalds was born in Helsinki, Finland, the son of journalists Anna and Nils Torvalds, and the grandson of poet Ole Torvalds. Both of his parents were campus radicals at the University of Helsinki in the 1960s. His family belongs to the Swedish-speaking minority (5.5%) of Finland's population. Torvalds was named after Linus Pauling, the American Nobel Prize-winning chemist, although in the book Rebel Code: Linux and the Open Source Revolution, Torvalds is quoted as saying, "I think I was named equally for Linus the Peanuts cartoon character," noting that this makes him half "Nobel-prize-winning chemist" and half "blanket-carrying cartoon character".

Torvalds attended the University of Helsinki from 1988 to 1996, graduating with a master's degree in computer science from the NODES research group. His academic career was interrupted after his first year of study when he joined the Finnish Army, selecting the 11-month officer training program to fulfill Finland's mandatory military service. In the army he held the rank of second lieutenant, with the role of fire controller, calculating positions of guns, targets, and trajectories, and finally telling the guns where to shoot. In 1990, he resumed his university studies and was exposed to UNIX for the first time, in the form of a DEC MicroVAX running ULTRIX. His M.Sc. thesis was titled Linux: A Portable Operating System.

His interest in computers began with a Commodore VIC-20. After the VIC-20 he purchased a Sinclair QL, which he modified extensively, especially its operating system. He wrote an assembler and a text editor for the QL, as well as a few games. He is known to have written a Pac-Man clone named Cool Man. On January 2, 1991 he purchased an Intel 80386-based IBM PC and spent a month playing the game Prince of Persia before receiving his copy of MINIX, which in turn enabled him to begin his work on Linux.

Later years

After a visit to Transmeta in late 1996, he accepted a position at the company in California, where he would work from February 1997 through June 2003. He then moved to the Open Source Development Labs, which has since merged with the Free Standards Group to become the Linux Foundation, under whose auspices he continues to work. In June 2004, Torvalds and his family moved to Portland, Oregon to be closer to the OSDL's Beaverton, Oregon-based headquarters.

From 1997 to 1999 he was involved in 86open, helping to choose the standard binary format for Linux and Unix.

Red Hat and VA Linux, both leading developers of Linux-based software, presented Torvalds with stock options in gratitude for his creation. In 1999, both companies went public and Torvalds' net worth shot up to roughly $20 million.

His personal mascot is a penguin nicknamed Tux, which has been widely adopted by the Linux community as the mascot of the Linux kernel.

Although Torvalds believes that "open source is the only right way to do software", he has also said that he uses the "best tool for the job", even if that includes proprietary software.[14] He has been criticized for his use and alleged advocacy of the proprietary BitKeeper software for version control in the Linux kernel. However, Torvalds has since written a free-software replacement for BitKeeper called Git. Torvalds has commented on official GNOME development mailing lists that, in terms of desktop environments, he encourages users to switch to KDE. However, Torvalds thought KDE 4.0 was a "disaster" because of its lack of maturity, so he temporarily switched to GNOME.