
Computing

From Wikipedia, the free encyclopedia

Computer simulation, one of the main cross-computing methodologies

Computing is any goal-oriented activity requiring, benefiting from, or creating computing machinery.[1] It includes the study of and experimentation with algorithmic processes, and the development of both hardware and software. Computing has scientific, engineering, mathematical, technological, and social aspects. Major computing disciplines include computer engineering, computer science, cybersecurity, data science, information systems, information technology, and software engineering.[2]

The term computing is also synonymous with counting and calculating. In earlier times, it was used in reference to the action performed by mechanical computing machines, and before that, to human computers.[3]

ENIAC, an early vacuum-tube, Turing-complete machine and the first programmable, general-purpose electronic digital computer

History

The history of computing is longer than the history of computing hardware and includes the history of methods intended for pen and paper (or for chalk and slate), with or without the aid of tables. Computing is intimately tied to the representation of numbers, though mathematical concepts necessary for computing existed before numeral systems. The earliest known tool for use in computation is the abacus, thought to have been invented in Babylon sometime between 2700 and 2300 BC. Abaci of a more modern design are still used as calculation tools today.

The first recorded proposal for using digital electronics in computing was the 1931 paper "The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena" by C. E. Wynn-Williams.[4] Claude Shannon's 1938 paper "A Symbolic Analysis of Relay and Switching Circuits" then introduced the idea of using electronics for Boolean algebraic operations.

The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947.[5][6] In 1953, the University of Manchester built the first transistorized computer, the Transistor Computer.[7] However, early junction transistors were relatively bulky devices that were difficult to mass-produce, which limited them to a number of specialised applications.[8]

In 1957, Frosch and Derick were able to manufacture the first silicon dioxide field-effect transistors at Bell Labs, the first transistors in which drain and source were adjacent at the surface.[9] Subsequently, a team demonstrated a working MOSFET at Bell Labs in 1960.[10][11] The MOSFET made it possible to build high-density integrated circuits,[12][13] leading to what is known as the computer revolution[14] or microcomputer revolution.[15]

Computer

A computer is a machine that manipulates data according to a set of instructions called a computer program.[16] The program has an executable form that the computer can use directly to execute the instructions. The same program in its human-readable source code form enables a programmer to study and develop a sequence of steps known as an algorithm.[17] Because different types of computers carry out instructions differently, a single set of source instructions is converted to machine instructions according to the CPU type.[18]

The execution process carries out the instructions in a computer program. Instructions express the computations performed by the computer. They trigger sequences of simple actions on the executing machine. Those actions produce effects according to the semantics of the instructions.
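
The following minimal sketch (assuming Python as the source language and Euclid's algorithm as an arbitrary example) shows an algorithm written as human-readable source code, which an interpreter or compiler then translates into machine instructions for a particular CPU:

```python
# Illustrative example: Euclid's algorithm expressed as human-readable source code.
def gcd(a: int, b: int) -> int:
    """Compute the greatest common divisor by repeated remainder."""
    while b != 0:
        a, b = b, a % b   # each high-level step becomes simpler machine actions
    return a

print(gcd(48, 36))  # prints 12
```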

Computer hardware

Computer hardware includes the physical parts of a computer, including the central processing unit, memory, and input/output.[19] Computational logic and computer architecture are key topics in the field of computer hardware.[20][21]
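
As a rough sketch of how the central processing unit, memory, and input/output cooperate, the following toy simulation of a three-instruction machine (a hypothetical design, not any real architecture) steps through a fetch-decode-execute loop:

```python
# Hypothetical toy machine: illustrates fetch-decode-execute, not a real CPU.
def run(program, memory):
    acc = 0                              # accumulator register
    for opcode, operand in program:      # fetch the next instruction
        if opcode == "LOAD":             # decode and execute it
            acc = memory[operand]
        elif opcode == "ADD":
            acc += memory[operand]
        elif opcode == "PRINT":          # a stand-in for an output device
            print(acc)
    return acc

memory = {"x": 2, "y": 3}                # data held in "memory"
run([("LOAD", "x"), ("ADD", "y"), ("PRINT", None)], memory)  # prints 5
```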

Computer software

Computer software, or simply software, is a collection of computer programs and related data, held in the storage of a computer, that provides instructions to the machine. It comprises programs, procedures, and algorithms, together with the documentation concerned with the operation of a data processing system.[citation needed] A program performs the function it implements either by directly providing instructions to the computer hardware or by serving as input to another piece of software. The term was coined to contrast with the older term hardware (meaning physical devices). In contrast to hardware, software is intangible.[22]

Software is also sometimes used in a narrower sense, meaning application software only.

System software

System software, or systems software, is computer software designed to operate and control computer hardware, and to provide a platform for running application software. System software includes operating systems, utility software, device drivers, window systems, and firmware. Frequently used development tools such as compilers, linkers, and debuggers are classified as system software.[23] System software and middleware manage and integrate a computer's capabilities, but typically do not directly apply them in the performance of tasks that benefit the user, unlike application software.
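
As an illustration of the kind of translation such tools perform, CPython's built-in compile() function and the standard dis module (an inspection aid) show the bytecode produced from a line of source; this is only a sketch, since production compilers typically emit machine code rather than bytecode:

```python
import dis

source = "total = 2 + 3"
code_object = compile(source, filename="<example>", mode="exec")
dis.dis(code_object)   # prints the lower-level instructions the interpreter runs
```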

Application software

Application software, also known as an application or an app, is computer software designed to help the user perform specific tasks. Examples include enterprise software, accounting software, office suites, graphics software, and media players. Many application programs deal principally with documents.[24] Apps may be bundled with the computer and its system software, or may be published separately. Some users are satisfied with the bundled apps and need never install additional applications. The system software manages the hardware and serves the application, which in turn serves the user.

Application software applies the power of a particular computing platform or system software to a particular purpose. Some apps, such as Microsoft Office, are developed in multiple versions for several different platforms; others have narrower requirements and are generally referred to by the platform they run on, for example, a geography application for Windows, an Android application for education, or a game for Linux. Applications that run on only one platform and increase the desirability of that platform because of their popularity are known as killer applications.[25]

Computer network

A computer network, often simply referred to as a network, is a collection of hardware components and computers interconnected by communication channels that allow the sharing of resources and information.[26] When at least one process in one device is able to send or receive data to or from at least one process residing in a remote device, the two devices are said to be in a network. Networks may be classified according to a wide variety of characteristics such as the medium used to transport the data, communications protocol used, scale, topology, and organizational scope.

Communications protocols define the rules and data formats for exchanging information in a computer network, and provide the basis for network programming. One well-known communications protocol is Ethernet, a hardware and link layer standard that is ubiquitous in local area networks. Another common protocol is the Internet Protocol Suite, which defines a set of protocols for internetworking, i.e. for data communication between multiple networks, host-to-host data transfer, and application-specific data transmission formats.[27]
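
The following minimal sketch (the loopback address and port number are arbitrary choices for illustration) shows two processes on one machine exchanging data over TCP, the host-to-host transport protocol of the Internet Protocol Suite, using Python's standard socket module:

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007          # assumed loopback address and example port

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, PORT))
srv.listen(1)                            # the "server" process waits for a peer

def echo_once():
    conn, _ = srv.accept()               # accept a single connection
    with conn:
        data = conn.recv(1024)           # receive bytes from the remote process
        conn.sendall(b"echo: " + data)   # and send a reply back

threading.Thread(target=echo_once, daemon=True).start()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))            # the "client" process opens a TCP connection
    cli.sendall(b"hello, network")
    print(cli.recv(1024).decode())       # prints: echo: hello, network

srv.close()
```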

Computer networking is sometimes considered a sub-discipline of electrical engineering, telecommunications, computer science, information technology, or computer engineering, since it relies upon the theoretical and practical application of these disciplines.[28]

Internet

The Internet is a global system of interconnected computer networks that use the standard Internet Protocol Suite (TCP/IP) to serve billions of users. This includes millions of private, public, academic, business, and government networks, ranging in scope from local to global. These networks are linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries an extensive range of information resources and services, such as the inter-linked hypertext documents of the World Wide Web and the infrastructure to support email.[29]
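
As a small sketch, fetching one of the Web's inter-linked hypertext documents over HTTP (carried on top of TCP/IP) can be done with Python's standard urllib; the URL shown is an arbitrary example:

```python
from urllib.request import urlopen

with urlopen("https://www.example.org/") as response:    # HTTP request over TCP/IP
    html = response.read().decode("utf-8")                # the hypertext document
print(html[:80])                                          # show the first characters
```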

Computer programming

Computer programming is the process of writing, testing, debugging, and maintaining the source code and documentation of computer programs. This source code is written in a programming language, which is an artificial language that is often more restrictive than natural languages, but easily translated by the computer. Programming is used to invoke some desired behavior (customization) from the machine.[30]
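
The writing and testing parts of this cycle can be sketched as follows; the function and its test are invented for illustration rather than taken from any particular project:

```python
import unittest

def is_palindrome(text: str) -> bool:
    """Return True if text reads the same forwards and backwards."""
    cleaned = "".join(ch.lower() for ch in text if ch.isalnum())
    return cleaned == cleaned[::-1]

class PalindromeTest(unittest.TestCase):
    def test_examples(self):
        self.assertTrue(is_palindrome("Madam"))      # expected behaviour, documented
        self.assertFalse(is_palindrome("computer"))  # as an automated test

if __name__ == "__main__":
    unittest.main()   # running the tests supports debugging and maintenance
```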

Writing high-quality source code requires knowledge of both the computer science domain and the domain in which the application will be used. The highest-quality software is thus often developed by a team of domain experts, each a specialist in some area of development.[31] However, the term programmer may apply to a range of program quality, from hacker to open source contributor to professional. It is also possible for a single programmer to do most or all of the computer programming needed to generate the proof of concept to launch a new killer application.[32]

Computer programmer

A programmer, computer programmer, or coder is a person who writes computer software. The term computer programmer can refer to a specialist in one area of computer programming or to a generalist who writes code for many kinds of software. One who practices or professes a formal approach to programming may also be known as a programmer analyst.[citation needed] A programmer's primary computer language (C, C++, Java, Lisp, Python, etc.) is often prefixed to the above titles, and those who work in a web environment often prefix their titles with Web. The term programmer can be used to refer to a software developer, software engineer, computer scientist, or software analyst. However, members of these professions typically possess other software engineering skills, beyond programming.[33]

Computer industry

The computer industry is made up of businesses involved in developing computer software, designing computer hardware and computer networking infrastructures, manufacturing computer components, and providing information technology services, including system administration and maintenance.[citation needed]

The software industry includes businesses engaged in development, maintenance, and publication of software. The industry also includes software services, such as training, documentation, and consulting.[citation needed]

Sub-disciplines of computing

Computer engineering

Computer engineering is a discipline that integrates several fields of electrical engineering and computer science required to develop computer hardware and software.[34] Computer engineers usually have training in electronic engineering (or electrical engineering), software design, and hardware-software integration, rather than just software engineering or electronic engineering. Computer engineers are involved in many hardware and software aspects of computing, from the design of individual microprocessors, personal computers, and supercomputers, to circuit design. This field of engineering includes not only the design of hardware within its own domain, but also the interactions between hardware and the context in which it operates.[35]

Software engineering

Software engineering is the application of a systematic, disciplined, and quantifiable approach to the design, development, operation, and maintenance of software, and the study of these approaches; that is, the application of engineering to software.[36][37][38] It is the act of using insights to conceive, model, and scale a solution to a problem. The term was first used at the 1968 NATO Software Engineering Conference, where it was intended to provoke thought regarding the perceived software crisis of the time.[39][40][41] Software development, a widely used and more generic term, does not necessarily subsume the engineering paradigm. The generally accepted concepts of software engineering as an engineering discipline are specified in the Guide to the Software Engineering Body of Knowledge (SWEBOK), which has become an internationally accepted standard, published as ISO/IEC TR 19759:2015.[42]

Computer science

Computer science or computing science (abbreviated CS or Comp Sci) is the scientific and practical approach to computation and its applications. A computer scientist specializes in the theory of computation and the design of computational systems.[43]

Its subfields can be divided into practical techniques for its implementation and application in computer systems, and purely theoretical areas. Some, such as computational complexity theory, which studies fundamental properties of computational problems, are highly abstract, while others, such as computer graphics, emphasize real-world applications. Others focus on the challenges in implementing computations. For example, programming language theory studies approaches to the description of computations, while the study of computer programming investigates the use of programming languages and complex systems. The field of human–computer interaction focuses on the challenges in making computers and computations useful, usable, and universally accessible to humans.[44]
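
For instance, the contrast studied in computational complexity theory between linear and logarithmic running time can be sketched with two search routines (an informal illustration, not a formal treatment):

```python
from bisect import bisect_left

def linear_search(items, target):
    for i, value in enumerate(items):        # up to n comparisons: O(n)
        if value == target:
            return i
    return -1

def binary_search(sorted_items, target):
    i = bisect_left(sorted_items, target)    # about log2(n) comparisons: O(log n)
    return i if i < len(sorted_items) and sorted_items[i] == target else -1

data = list(range(0, 1_000_000, 2))          # a large sorted list of even numbers
print(linear_search(data, 999_998), binary_search(data, 999_998))  # 499999 499999
```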

Cybersecurity

The field of cybersecurity pertains to the protection of computer systems and networks. This includes information and data privacy, preventing disruption of IT services and prevention of theft of and damage to hardware, software, and data.[45]
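
One elementary building block of such protection, detecting tampering by comparing cryptographic digests, can be sketched as follows (the messages are invented examples):

```python
import hashlib

message = b"transfer 100 units to account 42"
digest = hashlib.sha256(message).hexdigest()            # fingerprint of the original data

# A receiver recomputes the digest; any mismatch reveals that the data was altered.
tampered = b"transfer 900 units to account 42"
print(digest == hashlib.sha256(message).hexdigest())    # True: data unchanged
print(digest == hashlib.sha256(tampered).hexdigest())   # False: data was altered
```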

Data science

Data science is a field that uses scientific and computing tools to extract information and insights from data, driven by the increasing volume and availability of data.[46] Data mining, big data, statistics, machine learning and deep learning are all interwoven with data science.[47]
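
A toy example of extracting a simple insight from data, using invented readings and Python's standard statistics module (whose correlation function requires Python 3.10 or later), might look like this:

```python
import statistics

temperatures = [18.5, 21.0, 23.4, 19.8, 25.1, 22.7]   # hypothetical sensor readings
sales = [120, 135, 160, 128, 172, 150]                # hypothetical daily sales

print("mean temperature:", round(statistics.mean(temperatures), 1))
print("sales spread (std dev):", round(statistics.stdev(sales), 1))
print("temperature-sales correlation:", round(statistics.correlation(temperatures, sales), 2))
```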

Information systems

Information systems (IS) is the study of complementary networks of hardware and software (see information technology) that people and organizations use to collect, filter, process, create, and distribute data.[48][49][50] The ACM's Computing Careers describes IS as:

"A majority of IS [degree] programs are located in business schools; however, they may have different names such as management information systems, computer information systems, or business information systems. All IS degrees combine business and computing topics, but the emphasis between technical and organizational issues varies among programs. For example, programs differ substantially in the amount of programming required."[51]

The study of IS bridges business and computer science, using the theoretical foundations of information and computation to study various business models and related algorithmic processes within a computer science discipline.[52][53][54] The field of Computer Information Systems (CIS) studies computers and algorithmic processes, including their principles, their software and hardware designs, their applications, and their impact on society[55][56] while IS emphasizes functionality over design.[57]

Information technology

Information technology (IT) is the application of computers and telecommunications equipment to store, retrieve, transmit, and manipulate data,[58] often in the context of a business or other enterprise.[59] The term is commonly used as a synonym for computers and computer networks, but also encompasses other information distribution technologies such as television and telephones. Several industries are associated with information technology, including computer hardware, software, electronics, semiconductors, internet, telecom equipment, e-commerce, and computer services.[60][61]

Research and emerging technologies

DNA-based computing and quantum computing are areas of active research for both computing hardware and software, such as the development of quantum algorithms. Potential infrastructure for future technologies includes DNA origami on photolithography[62] and quantum antennae for transferring information between ion traps.[63] By 2011, researchers had entangled 14 qubits.[64][65] Fast digital circuits, including those based on Josephson junctions and rapid single flux quantum technology, are coming closer to practical realization with the discovery of nanoscale superconductors.[66]

Fiber-optic and photonic (optical) devices, which already have been used to transport data over long distances, are starting to be used by data centers, along with CPU and semiconductor memory components. This allows the separation of RAM from CPU by optical interconnects.[67] IBM has created an integrated circuit with both electronic and optical information processing in one chip. This is denoted CMOS-integrated nanophotonics (CINP).[68] One benefit of optical interconnects is that motherboards, which formerly required a certain kind of system on a chip (SoC), can now move formerly dedicated memory and network controllers off the motherboards, spreading the controllers out onto the rack. This allows standardization of backplane interconnects and motherboards for multiple types of SoCs, which allows more timely upgrades of CPUs.[69]

Another field of research is spintronics. Spintronics can provide computing power and storage, without heat buildup.[70] Some research is being done on hybrid chips, which combine photonics and spintronics.[71][72] There is also research ongoing on combining plasmonics, photonics, and electronics.[73]

Cloud computing

Cloud computing is a model that allows for the use of computing resources, such as servers or applications, without the need for interaction between the owner of these resources and the end user. It is typically offered as a service, making it an example of Software as a Service, Platform as a Service, or Infrastructure as a Service, depending on the functionality offered. Key characteristics include on-demand access, broad network access, and the capability of rapid scaling.[74] It allows individual users or small businesses to benefit from economies of scale.

One area of interest in this field is its potential to support energy efficiency. Allowing thousands of instances of computation to occur on one single machine instead of thousands of individual machines could help save energy. It could also ease the transition to renewable energy sources, since it would suffice to power one server farm with renewable energy, rather than millions of homes and offices.[75]

However, this centralized computing model poses several challenges, especially in security and privacy. Current legislation does not sufficiently protect users from companies mishandling their data on company servers. This suggests potential for further legislative regulations on cloud computing and tech companies.[76]

Quantum computing

Quantum computing is an area of research that brings together the disciplines of computer science, information theory, and quantum physics. While the idea of information as part of physics is relatively new, there appears to be a strong tie between information theory and quantum mechanics.[77] Whereas traditional computing operates on a binary system of ones and zeros, quantum computing uses qubits. Qubits are capable of being in a superposition, i.e. in both the one and zero states simultaneously; the value read from a qubit is therefore not fixed but depends on when it is measured. This property of superposition, together with quantum entanglement, is the core idea of quantum computing that allows quantum computers to perform large-scale computations.[78] Quantum computing is often used for scientific research in cases where traditional computers do not have the computing power to do the necessary calculations, such as in molecular modeling. Large molecules and their reactions are far too complex for traditional computers to calculate, but the computational power of quantum computers could provide a tool to perform such calculations.[79]
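
Measurement of a qubit in an equal superposition can be mimicked classically as a sketch of the underlying probabilities (this simulation is purely illustrative and offers none of the speed-ups of real quantum hardware):

```python
import random

amp0 = amp1 = 2 ** -0.5        # amplitudes of an equal superposition of |0> and |1>
p_zero = abs(amp0) ** 2        # Born rule: probability of reading 0 is |amplitude|^2

def measure():
    return 0 if random.random() < p_zero else 1   # measurement yields 0 or 1

samples = [measure() for _ in range(10_000)]
print("fraction of zeros:", samples.count(0) / len(samples))   # roughly 0.5
```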

References

  1. ^ "Computing Classification System". Digital Library. Association for Computing Machinery.
  2. ^ "Computing Careers & Disciplines: A Quick Guide for Prospective Students and Career Advisors (2nd edition, ©2020)". CERIC. 17 January 2020. Retrieved 4 July 2022.
  3. ^ "The History of Computing". mason.gmu.edu. Retrieved 12 April 2019.
  4. ^ Wynn-Williams, C. E. (2 July 1931), "The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena", Proceedings of the Royal Society A, 132 (819): 295–310, Bibcode:1931RSPSA.132..295W, doi:10.1098/rspa.1931.0102
  5. ^ Lee, Thomas H. (2003). The Design of CMOS Radio-Frequency Integrated Circuits (PDF). Cambridge University Press. ISBN 9781139643771. Archived from the original (PDF) on 9 December 2019. Retrieved 16 September 2019.
  6. ^ Puers, Robert; Baldi, Livio; Voorde, Marcel Van de; Nooten, Sebastiaan E. van (2017). Nanoelectronics: Materials, Devices, Applications, 2 Volumes. John Wiley & Sons. p. 14. ISBN 9783527340538.
  7. ^ Lavington, Simon (1998), A History of Manchester Computers (2 ed.), Swindon: The British Computer Society, pp. 34–35
  8. ^ Moskowitz, Sanford L. (2016). Advanced Materials Innovation: Managing Global Technology in the 21st century. John Wiley & Sons. pp. 165–167. ISBN 9780470508923.
  9. ^ Frosch, C. J.; Derick, L (1957). "Surface Protection and Selective Masking during Diffusion in Silicon". Journal of the Electrochemical Society. 104 (9): 547. doi:10.1149/1.2428650.
  10. ^ KAHNG, D. (1961). "Silicon-Silicon Dioxide Surface Device". Technical Memorandum of Bell Laboratories: 583–596. doi:10.1142/9789814503464_0076. ISBN 978-981-02-0209-5.
  11. ^ Lojek, Bo (2007). History of Semiconductor Engineering. Berlin, Heidelberg: Springer-Verlag Berlin Heidelberg. p. 321. ISBN 978-3-540-34258-8.
  12. ^ "Who Invented the Transistor?". Computer History Museum. 4 December 2013. Retrieved 20 July 2019.
  13. ^ Hittinger, William C. (1973). "Metal-Oxide-Semiconductor Technology". Scientific American. 229 (2): 48–59. Bibcode:1973SciAm.229b..48H. doi:10.1038/scientificamerican0873-48. ISSN 0036-8733. JSTOR 24923169.
  14. ^ Fossum, Jerry G.; Trivedi, Vishal P. (2013). Fundamentals of Ultra-Thin-Body MOSFETs and FinFETs. Cambridge University Press. p. vii. ISBN 9781107434493.
  15. ^ Malmstadt, Howard V.; Enke, Christie G.; Crouch, Stanley R. (1994). Making the Right Connections: Microcomputers and Electronic Instrumentation. American Chemical Society. p. 389. ISBN 9780841228610. The relative simplicity and low power requirements of MOSFETs have fostered today's microcomputer revolution.
  16. ^ "Definition of computer". PCMAG. Retrieved 5 February 2024.
  17. ^ Denny, Jory (16 October 2020). "What is an algorithm? How computers know what to do with data". The Conversation. Retrieved 5 February 2024.
  18. ^ Butterfield, Andrew; Ngondi, Gerard Ekembe; Kerr, Anne (21 January 2016), Butterfield, Andrew; Ngondi, Gerard Ekembe; Kerr, Anne (eds.), "computer", A Dictionary of Computer Science, Oxford University Press, doi:10.1093/acref/9780199688975.001.0001, ISBN 978-0-19-968897-5, retrieved 5 February 2024
  19. ^ "Common CPU components – The CPU – Eduqas – GCSE Computer Science Revision – Eduqas – BBC Bitesize". www.bbc.co.uk. Retrieved 5 February 2024.
  20. ^ Paulson, Laurence (28 February 2018). "Computational logic: its origins and applications". Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences. 474 (2210). arXiv:1712.04375. Bibcode:2018RSPSA.47470872P. doi:10.1098/rspa.2017.0872. PMC 5832843. PMID 29507522.
  21. ^ Paulson, Lawrence C. (February 2018). "Computational logic: its origins and applications". Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences. 474 (2210): 20170872. arXiv:1712.04375. Bibcode:2018RSPSA.47470872P. doi:10.1098/rspa.2017.0872. PMC 5832843. PMID 29507522.
  22. ^ "Wordreference.com: WordNet 2.0". Princeton University, Princeton, NJ. Retrieved 19 August 2007.
  23. ^ Rouse, Margaret (March 2019). "system software". WhatIs.com. TechTarget.
  24. ^ "Basic Computer Terms". web.pdx.edu. Retrieved 18 April 2024.
  25. ^ "The Fibreculture Journal : 25 | FCJ-181 There's a History for That: Apps and Mundane Software as Commodity". Retrieved 5 February 2024.
  26. ^ "Computer network definition". Archived from the original on 21 January 2012. Retrieved 12 November 2011.
  27. ^ "TCP/IP: What is TCP/IP and How Does it Work?". Networking. Retrieved 14 March 2024.
  28. ^ Dhavaleswarapu, Ratna. (2019). The Pallid Image of Globalization in Kiran Desai's The Inheritance of Loss. Retrieved 19 April 2024.
  29. ^ "Internet | Description, History, Uses & Facts". Encyclopedia Britannica. 3 June 2024. Retrieved 7 June 2024.
  30. ^ McGee, Vanesha (8 November 2023). "What is Coding and What Is It Used For?". ComputerScience.org. Retrieved 23 June 2024.
  31. ^ Nagl, Manfred, ed. (1995). Graph-Theoretic Concepts in Computer Science. Lecture Notes in Computer Science. Vol. 1017. doi:10.1007/3-540-60618-1. ISBN 978-3-540-60618-5. ISSN 0302-9743.
  32. ^ Parsons, June (2022). "New Perspectives Computer Concepts Comprehensive | 21st Edition". Cengage. 21st edition. ISBN 9780357674819.
  33. ^ "5 Skills Developers Need Beyond Writing Code". 23 January 2019.
  34. ^ IEEE Computer Society; ACM (12 December 2004). Computer Engineering 2004: Curriculum Guidelines for Undergraduate Degree Programs in Computer Engineering (PDF). p. iii. Archived from the original (PDF) on 12 June 2019. Retrieved 17 December 2012. Computer System engineering has traditionally been viewed as a combination of both electronic engineering (EE) and computer science (CS).
  35. ^ Trinity College Dublin. "What is Computer System Engineering". Retrieved 21 April 2006. "Computer engineers need not only to understand how computer systems themselves work, but also how they integrate into the larger picture. Consider the car. A modern car contains many separate computer systems for controlling such things as the engine timing, the brakes and the air bags. To be able to design and implement such a car, the computer engineer needs a broad theoretical understanding of all these various subsystems & how they interact."
  36. ^ Abran, Alain; Moore, James W.; Bourque, Pierre; Dupuis, Robert; Tripp, Leonard L. (2004). Guide to the Software Engineering Body of Knowledge. IEEE. p. 1. ISBN 978-0-7695-2330-9.
  37. ^ ACM (2006). "Computing Degrees & Careers". ACM. Archived from the original on 17 June 2011. Retrieved 23 November 2010.
  38. ^ Laplante, Phillip (2007). What Every Engineer Should Know about Software Engineering. Boca Raton: CRC. ISBN 978-0-8493-7228-5. Retrieved 21 January 2011.
  39. ^ Sommerville, Ian (2008). Software Engineering (7 ed.). Pearson Education. p. 26. ISBN 978-81-7758-530-8. Retrieved 10 January 2013.
  40. ^ Peter, Naur; Randell, Brian (7–11 October 1968). Software Engineering: Report of a conference sponsored by the NATO Science Committee (PDF). Garmisch, Germany: Scientific Affairs Division, NATO. Retrieved 26 December 2008.
  41. ^ Randell, Brian (10 August 2001). "The 1968/69 NATO Software Engineering Reports". Brian Randell's University Homepage. The School of the Computer Sciences, Newcastle University. Retrieved 11 October 2008. The idea for the first NATO Software Engineering Conference, and in particular that of adopting the then practically unknown term software engineering as its (deliberately provocative) title, I believe came originally from Professor Fritz Bauer.
  42. ^ "Software Engineering – Guide to the software engineering body of knowledge (SWEBOK)". International Organization for Standardization. ISO/IEC TR 19759:2015. Retrieved 21 May 2019.
  43. ^ "WordNet Search – 3.1". Wordnetweb.princeton.edu. Retrieved 14 May 2012.
  44. ^ "The Interaction Design Foundation - What is Human-Computer Interaction (HCI)?".
  45. ^ Schatz, Daniel; Bashroush, Rabih; Wall, Julie (2017). "Towards a More Representative Definition of Cyber Security". The Journal of Digital Forensics, Security and Law. 12 (2). doi:10.15394/jdfsl.2017.1476.
  46. ^ Dhar, Vasant (2013). "Data science and prediction". Communications of the ACM. 56 (12): 64–73. doi:10.1145/2500499. ISSN 0001-0782.
  47. ^ Cao, Longbing (31 May 2018). "Data Science: A Comprehensive Overview". ACM Computing Surveys. 50 (3): 1–42. arXiv:2007.03606. doi:10.1145/3076253. ISSN 0360-0300. S2CID 207595944.
  48. ^ "Definition of Application Landscape". Software Engineering for Business Information Systems (sebis). 21 January 2009. Archived from the original on 5 March 2011. Retrieved 14 January 2011.
  49. ^ Denning, Peter (July 1999). "COMPUTER SCIENCE: THE DISCIPLINE". Encyclopaedia of Computer Science (2000 Edition). The Domain of Computer Science: Even though computer science addresses both human-made and natural information processes, the main effort in the discipline has been directed toward human-made processes, especially information processing systems and machines
  50. ^ Jessup, Leonard M.; Valacich, Joseph S. (2008). Information Systems Today (3rd ed.). Pearson Publishing. pp. –, 416.
  51. ^ "Computing Degrees & Careers " Information Systems". Association for Computing Machinery. Archived from the original on 6 July 2018. Retrieved 6 July 2018.
  52. ^ Davis, Timothy; Geist, Robert; Matzko, Sarah; Westall, James (March 2004). "τ'εχνη: A First Step". Technical Symposium on Computer Science Education: 125–129. ISBN 1-58113-798-2. In 1999, Clemson University established a (graduate) degree program that bridges the arts and the sciences... All students in the program are required to complete graduate level work in both the arts and computer science
  53. ^ Khazanchi, Deepak; Bjorn Erik Munkvold (Summer 2000). "Is information system a science? an inquiry into the nature of the information systems discipline". ACM SIGMIS Database. 31 (3): 24–42. doi:10.1145/381823.381834. ISSN 0095-0033. S2CID 52847480. From this we have concluded that IS is a science, i.e., a scientific discipline in contrast to purportedly non-scientific fields
  54. ^ "Bachelor of Information Sciences (Computer Science)". Massey University. 24 February 2006. Archived from the original on 19 June 2006. Computer Science is the study of all aspects of computer systems, from the theoretical foundations to the very practical aspects of managing large software projects
  55. ^ Polack, Jennifer (December 2009). "Planning a CIS Education Within a CS Framework". Journal of Computing Sciences in Colleges. 25 (2): 100–106. ISSN 1937-4771.
  56. ^ Hayes, Helen; Onkar Sharma (February 2003). "A decade of experience with a common first year program for computer science, information systems and information technology majors". Journal of Computing Sciences in Colleges. 18 (3): 217–227. ISSN 1937-4771. In 1988, a degree program in Computer Information Systems (CIS) was launched with the objective of providing an option for students who were less inclined to become programmers and were more interested in learning to design, develop, and implement Information Systems, and solve business problems using the systems approach
  57. ^ Freeman, Peter; Hart, David (August 2004). "A Science of Design for Software-Intensive Systems". Communications of the ACM. 47 (8): 19–21. doi:10.1145/1012037.1012054. ISSN 0001-0782. S2CID 14331332. Computer science and engineering needs an intellectually rigorous, analytical, teachable design process to ensure development of systems we all can live with ... Though the other components' connections to the software and their role in the overall design of the system are critical, the core consideration for a software-intensive system is the software itself, and other approaches to systematizing design have yet to solve the "software problem"—which won't be solved until software design is understood scientifically.
  58. ^ Daintith, John, ed. (2009), "IT", A Dictionary of Physics, Oxford University Press, ISBN 9780199233991, retrieved 1 August 2012 (subscription required)
  59. ^ "Free on-line dictionary of computing (FOLDOC)". Archived from the original on 15 April 2013. Retrieved 9 February 2013.
  60. ^ Chandler, Daniel; Munday, Rod (January 2011), "Information technology", A Dictionary of Media and Communication (first ed.), Oxford University Press, ISBN 978-0-19-956875-8, retrieved 1 August 2012 (subscription required)
  61. ^ On the later more broad application of the term IT, Keary comments: "In its original application 'information technology' was appropriate to describe the convergence of technologies with application in the broad field of data storage, retrieval, processing, and dissemination. This useful conceptual term has since been converted to what purports to be concrete use, but without the reinforcement of definition...the term IT lacks substance when applied to the name of any function, discipline, or position." Anthony Ralston (2000). Encyclopedia of computer science. Nature Pub. Group. ISBN 978-1-56159-248-7. Retrieved 12 May 2013.
  62. ^ Kershner, Ryan J.; Bozano, Luisa D.; Micheel, Christine M.; Hung, Albert M.; Fornof, Ann R.; Cha, Jennifer N.; Rettner, Charles T.; Bersani, Marco; Frommer, Jane; Rothemund, Paul W. K.; Wallraff, Gregory M. (2009). "Placement and orientation of individual DNA shapes on lithographically patterned surfaces". Nature Nanotechnology. 4 (9): 557–561. Bibcode:2009NatNa...4..557K. CiteSeerX 10.1.1.212.9767. doi:10.1038/nnano.2009.220. PMID 19734926. supplementary information: DNA origami on photolithography
  63. ^ Harlander, M. (2011). "Trapped-ion antennae for the transmission of quantum information". Nature. 471 (7337): 200–203. arXiv:1011.3639. Bibcode:2011Natur.471..200H. doi:10.1038/nature09800. PMID 21346764. S2CID 4388493.
  64. ^ Monz, Thomas (2011). "14-Qubit Entanglement: Creation and Coherence". Physical Review Letters. 106 (13): 130506. arXiv:1009.6126. Bibcode:2011PhRvL.106m0506M. doi:10.1103/PhysRevLett.106.130506. PMID 21517367. S2CID 8155660.
  65. ^ "World record: Calculations with 14 quantum bits". www.nanowerk.com.
  66. ^ Saw-Wai Hla et al., Nature Nanotechnology 31 March 2010 "World's smallest superconductor discovered" Archived 28 May 2010 at the Wayback Machine. Four pairs of certain molecules have been shown to form a nanoscale superconductor, at a dimension of 0.87 nanometers. Access date 31 March 2010
  67. ^ Tom Simonite, "Computing at the speed of light", Technology Review Wed., August 4, 2010 MIT
  68. ^ Anthony, Sebastian (10 December 2012). "IBM creates first commercially viable silicon nanophotonic chip". Retrieved 10 December 2012.
  69. ^ "Open Compute: Does the data center have an open future?". Retrieved 11 August 2013.
  70. ^ "Putting electronics in a spin". 8 August 2007. Retrieved 23 November 2020.
  71. ^ "Merging spintronics with photonics" (PDF). Archived from the original (PDF) on 6 September 2019. Retrieved 6 September 2019.
  72. ^ Lalieu, M. L. M.; Lavrijsen, R.; Koopmans, B. (10 January 2019). "Integrating all-optical switching with spintronics". Nature Communications. 10 (1): 110. arXiv:1809.02347. Bibcode:2019NatCo..10..110L. doi:10.1038/s41467-018-08062-4. ISSN 2041-1723. PMC 6328538. PMID 30631067.
  73. ^ Farmakidis, Nikolaos; Youngblood, Nathan; Li, Xuan; Tan, James; Swett, Jacob L.; Cheng, Zengguang; Wright, C. David; Pernice, Wolfram H. P.; Bhaskaran, Harish (1 November 2019). "Plasmonic nanogap enhanced phase-change devices with dual electrical-optical functionality". Science Advances. 5 (11): eaaw2687. arXiv:1811.07651. Bibcode:2019SciA....5.2687F. doi:10.1126/sciadv.aaw2687. ISSN 2375-2548. PMC 6884412. PMID 31819898.
  74. ^ "The NIST Definition of Cloud Computing" (PDF). U.S. Department of Commerce. September 2011. Archived (PDF) from the original on 9 October 2022.
  75. ^ Berl, A.; Gelenbe, E.; Girolamo, M. Di; Giuliani, G.; Meer, H. De; Dang, M. Q.; Pentikousis, K. (September 2010). "Energy-Efficient Cloud Computing". The Computer Journal. 53 (7): 1045–1051. doi:10.1093/comjnl/bxp080. ISSN 1460-2067.
  76. ^ Kaufman, L. M. (July 2009). "Data Security in the World of Cloud Computing". IEEE Security Privacy. 7 (4): 61–64. doi:10.1109/MSP.2009.87. ISSN 1558-4046. S2CID 16233643.
  77. ^ Steane, Andrew (1 February 1998). "Quantum computing". Reports on Progress in Physics. 61 (2): 117–173. arXiv:quant-ph/9708022. Bibcode:1998RPPh...61..117S. doi:10.1088/0034-4885/61/2/002. ISSN 0034-4885. S2CID 119473861.
  78. ^ Horodecki, Ryszard; Horodecki, Paweł; Horodecki, Michał; Horodecki, Karol (17 June 2009). "Quantum entanglement". Reviews of Modern Physics. 81 (2): 865–942. arXiv:quant-ph/0702225. Bibcode:2009RvMP...81..865H. doi:10.1103/RevModPhys.81.865. S2CID 59577352.
  79. ^ Baiardi, Alberto; Christandl, Matthias; Reiher, Markus (3 July 2023). "Quantum Computing for Molecular Biology*". ChemBioChem. 24 (13): e202300120. arXiv:2212.12220. doi:10.1002/cbic.202300120. PMID 37151197.