Categories
Science and Technology

Cyber security: Challenges and preparedness in India

Cyber security is the body of technologies, processes and practices designed to protect networks, computers, programs and data from attack, damage or unauthorized access. With the development of cyber technology and its overwhelming use for scientific research, for the delivery of services (financial and other services such as e-commerce, payments to public utilities, education and e-medicine) and for communication among the various departments and organs of government, including defence, over networks, it has become important to protect cyberspace from attack. Computer security is part of cyber security. Computer security, also known as IT security, is the protection of computer systems from theft of or damage to their hardware, software or information, as well as from disruption or misdirection of the services they provide.

Cyber technology is sometimes defined as a field of technology that deals with the development of artificial devices or machines that can be surgically implanted into a human form to improve or otherwise augment physical or mental abilities; such cybertechnological products are known as "cyberware". Cyber defence is a computer network defence mechanism which includes response to hostile actions, critical infrastructure protection and information assurance for organizations, government entities and other networks.

Cyber terrorism

Cyber terrorism is the use of the Internet to conduct violent acts that result in, or threaten, loss of life or significant bodily harm, in order to achieve political gains through intimidation. It is also sometimes considered an act of Internet terrorism in which terrorist activities, including deliberate, large-scale disruption of computer networks, especially of personal computers attached to the Internet, are carried out by means of tools such as computer viruses, computer worms or other malicious scripts. There is some difference between cyber crime and cyber terrorism.

Terrorism online should be considered cyberterrorism when fear is inflicted on a group of people, whereas cybercrime is the act of committing a felony or crime online, typically without the use of fear and usually for financial gain. Even with these narrow and broad definitions, it can be difficult to distinguish which instances of online activity are cyberterrorism and which are cybercrime.

Cyber terrorism can also be defined as the intentional use of computers, networks and the public internet to cause destruction and harm for personal objectives. Experienced cyber terrorists, who are highly skilled hackers, can cause massive damage to government systems, hospital records and national security programmes, which might leave a country, community or organization in turmoil and in fear of further attacks. The objectives of such terrorists may be political or ideological, since this can be considered a form of terror.

In recent years, with the massive growth of Muslim extremist activities, there has been a significant rise in the exploitation of internet technologies for committing terror and cyber terror attacks against western targets. There have been several major and minor instances of cyber terrorism. Al-Qaeda utilized the internet to communicate with supporters and even to recruit new members. Estonia, a Baltic country which is constantly evolving in terms of technology, became a battleground for cyber terror in April 2007 after disputes regarding the removal of a WWII Soviet statue located in Estonia's capital, Tallinn.

Types and forms of Cyber Attacks

Social networking over the Internet has boomed in recent years because it allows networks of like-minded individuals to collaborate and connect, regardless of their respective geographies or physical locations. Cyber terrorism, as mentioned, is a very serious issue, and it covers a wide range of attacks.

Some of the major tools of cyber crime are botnets, malicious code hosted on websites, cyber espionage and the like; the 2007 attacks on Estonia combined several of these. It is pertinent to note here that there are other forms which fall under the heading of cyber crime and are simultaneously important tools for terrorist activity. These criminal activities are discussed one by one below:

Attacks via Internet: Unauthorized access & Hacking:

One of these criminal activities is unauthorized access, which means any kind of access without the permission of either the rightful owner or the person in charge of a computer, computer system or computer network.

Every act committed towards breaking into a computer and/or network is hacking. Hackers write, or use ready-made, computer programs to attack the target computer. They possess the desire to destroy, and they get a kick out of such destruction.

Trojan Attack:

A Trojan is a program that appears to do something useful but actually does things that are quite damaging. Programs of this kind are called Trojans.

Trojans come in two parts, a client part and a server part. When the victim unknowingly runs the server on his or her machine, the attacker uses the client to connect to the server and starts using the Trojan.

Virus and Worm attack:

A program that has the capability to infect other programs, make copies of itself and spread into other programs is called a virus.

Programs that replicate like viruses but spread from computer to computer on their own, without needing a host program, are called worms.

E-mail related crimes:

  1. Email spoofing: Email spoofing refers to email that appears to have originated from one source when it was actually sent from another.
  2. Email spamming: Email "spamming" refers to sending email to thousands and thousands of users, similar to a chain letter.
  3. Sending malicious code through email: E-mails are used to send viruses, Trojans etc. as attachments, or to send a link to a website which, on being visited, downloads malicious code.
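The spoofing described in point 1 can be made concrete with a short sketch. The addresses and the domain-mismatch heuristic below are illustrative assumptions, not a complete detector; real mail filters rely on SPF, DKIM and DMARC checks rather than this simple comparison.

```python
from email import message_from_string

# A spoofed message: the visible "From" header claims one sender,
# while the envelope sender (mimicked here by "Return-Path") differs.
raw = (
    "Return-Path: <attacker@evil.example>\n"
    "From: support@yourbank.example\n"
    "Subject: Verify your account\n"
    "\n"
    "Please click the link below.\n"
)

msg = message_from_string(raw)

def sender_mismatch(m):
    """Flag mail whose From domain differs from the envelope sender's."""
    frm = m["From"].split("@")[-1].strip("<> ")
    ret = m["Return-Path"].split("@")[-1].strip("<> ")
    return frm != ret

print(sender_mismatch(msg))  # True -> headers disagree, possible spoof
```

Because SMTP itself does not authenticate the `From` header, anyone can write an arbitrary value there; that is the entire mechanism of spoofing.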

Threat to large banks

One of the most popular forms of cyber terrorism is to threaten a large bank. The terrorists hack into the system and then leave an encrypted message for senior directors threatening the bank. What adds to the difficulty of catching the criminals is that they may be in another country. A second difficulty is that most banks would rather pay the money than let the public know how vulnerable they are.

Major components of cyber security

All the measures taken to protect data, data machines, data systems and networks form the cyber security framework. It concerns both hardware and software, and also regulations, laws and policies. The following are the major components of cyber security:

  • Application Security
  • Information Security
  • Disaster recovery
  • Network Security

Application security– encompasses measures or counter-measures that are taken during the development life-cycle to protect applications from threats that can come through flaws in the application design, development, deployment, upgrade or maintenance. Some basic techniques used for application security are: a) Input parameter validation, b) User/Role authentication & authorization, c) Session management, parameter manipulation & exception management, and d) Auditing and logging.
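Input parameter validation, the first technique listed above, can be sketched briefly. The field name, pattern and limits below are hypothetical examples of a whitelist-style check, not a prescribed standard:

```python
import re

# Hypothetical validator for a transfer-amount form field: accept only
# digits with an optional two-decimal fraction, within a business limit.
AMOUNT_RE = re.compile(r"^\d{1,7}(\.\d{1,2})?$")

def validate_amount(raw: str) -> float:
    """Reject anything that is not a well-formed positive amount."""
    if not AMOUNT_RE.fullmatch(raw.strip()):
        raise ValueError("amount must be a positive number like 1250.00")
    value = float(raw)
    if not (0 < value <= 1_000_000):
        raise ValueError("amount out of allowed range")
    return value

print(validate_amount("1250.50"))  # 1250.5
# validate_amount("1250; DROP TABLE accounts") raises ValueError,
# stopping an injection attempt before it reaches the database layer.
```

Validating against an explicit whitelist of allowed shapes, rather than blacklisting known-bad strings, is the usual design choice because attackers invent new payloads faster than blacklists can grow.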

Information security– protects information from unauthorized access to avoid identity theft and to protect privacy. Major techniques used to cover this are: a) Identification, authentication & authorization of user, b) Cryptography.
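One concrete meeting point of the two techniques above (authentication and cryptography) is password storage. A minimal sketch, using only Python's standard library; the iteration count and parameters are illustrative, not a policy recommendation:

```python
import hashlib
import hmac
import os

# Never store plain-text passwords: keep a salted PBKDF2 hash instead,
# and compare candidates in constant time to resist timing attacks.
def hash_password(password: str, salt: bytes = None):
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("s3cret")
print(verify_password("s3cret", salt, digest))  # True
print(verify_password("guess", salt, digest))   # False
```

Even if an attacker steals the stored `(salt, digest)` pairs, the slow key-derivation function makes brute-forcing each password far more expensive than cracking a plain hash.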

Disaster recovery– planning is a process that includes performing risk assessment, establishing priorities, developing recovery strategies in case of a disaster. Any business should have a concrete plan for disaster recovery to resume normal business operations as quickly as possible after a disaster.

Network security– includes activities to protect the usability, reliability, integrity and safety of the network. Effective network security targets a variety of threats and stops them from entering or spreading on the network. Network security components include: a) Anti-virus and anti-spyware, b)Firewall, to block unauthorized access to your network, c)Intrusion prevention systems (IPS), to identify fast-spreading threats, such as zero-day or zero-hour attacks, and d) Virtual Private Networks (VPNs), to provide secure remote access.
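The firewall component above boils down to matching traffic against rules. A toy packet filter in that spirit, using Python's `ipaddress` module; the blocked ranges are documentation-reserved example networks, chosen arbitrarily:

```python
import ipaddress

# Toy firewall rule table: drop traffic from blocked networks,
# allow everything else. The ranges below are made-up examples.
BLOCKED = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def allow(source_ip: str) -> bool:
    """Return True if a packet from source_ip should pass the filter."""
    addr = ipaddress.ip_address(source_ip)
    return not any(addr in net for net in BLOCKED)

print(allow("203.0.113.7"))  # False -> dropped
print(allow("192.0.2.10"))   # True  -> allowed
```

Real firewalls evaluate far richer rules (ports, protocols, connection state), but the core operation is the same membership test shown here.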

Cyber Security and India

According to Julie Gommes, a cybersecurity expert and member of the Computer Emergency Response Team, India may not be a major target of Jihadi cyberterrorism like some European countries, but the country remains as vulnerable to cyberterrorism as any other Jihadi-targeted country. Ms. Gommes, who led a technical session on cyberterrorism at the 9th edition of the International Cybersecurity and Policing Conference, which concluded in August 2016 in Kollam, said India should be alert to such acts of terrorism. All countries remain technically vulnerable to cyberterrorism, though socially India may appear to be less vulnerable. She said that websites at large continue to remain vulnerable to all kinds of cyberattacks. There is not much thrust on the security aspect of websites when they are launched; this aspect is given some thought only after an attack. But in most post-attack cases the web page is simply cleaned and is back in minutes, and no serious thought is given to cyberterrorism.

India’s vulnerability could be seen in the following threats:

  • Privacy violation
  • Data theft
  • Appropriation of government records
  • Denial of service (DoS) and distributed denial of service (DDoS) attacks
  • Network damage and destruction

Laws and regulations in India against cyber crimes and terror

Information technology act

The Information Technology Act, 2000 is an Act of the Indian Parliament (No. 21 of 2000) notified on 17 October 2000. Both cyber crimes and cyber terrorism are addressed by the same law. It is the primary law in India dealing with cyber crime and electronic commerce. It is based on the United Nations Model Law on Electronic Commerce 1996 (UNCITRAL Model) recommended by the General Assembly of the United Nations in a resolution dated 30 January 1997. The bill was passed in the budget session of 2000 and signed by President K. R. Narayanan on 9 May 2000. The bill was finalised by a group of officials headed by the then Minister of Information Technology, Pramod Mahajan. The original Act contained 94 sections, divided into 13 chapters and 4 schedules. The law applies to the whole of India. Persons of other nationalities can also be indicted under the law if the crime involves a computer or network located in India.

The Act provides a legal framework for electronic governance by giving recognition to electronic records and digital signatures. It directed the formation of a Controller of Certifying Authorities to regulate the issuing of digital signatures. It also defines cyber crimes and prescribes penalties for them, and it established a Cyber Appellate Tribunal to resolve disputes arising from the new law. The Act also amended various sections of the Indian Penal Code, 1860, the Indian Evidence Act, 1872, the Banker's Book Evidence Act, 1891, and the Reserve Bank of India Act, 1934 to make them compliant with new technologies. A major amendment was made in 2008. It introduced Section 66A, which penalized the sending of "offensive messages", and Section 69, which gave authorities the power of "interception or monitoring or decryption of any information through any computer resource". It also introduced penalties for child pornography, cyber terrorism and voyeurism. The amendment was passed on 22 December 2008 without any debate in the Lok Sabha; the next day it was passed by the Rajya Sabha, and it was signed by the then President, Pratibha Patil, on 5 February 2009.

Offences under the IT Act

List of offences and the corresponding penalties:

  • Section 65– Tampering with computer source documents- If a person knowingly or intentionally conceals, destroys or alters or intentionally or knowingly causes another to conceal, destroy or alter any computer source code used for a computer, computer programme, computer system or computer network, when the computer source code is required to be kept or maintained by law for the time being in force—– Imprisonment up to three years, or/and with fine up to Rs. 200,000
  • Section 66– Hacking with computer system–If a person with the intent to cause or knowing that he is likely to cause wrongful loss or damage to the public or any person destroys or deletes or alters any information residing in a computer resource or diminishes its value or utility or affects it injuriously by any means, commits hack—— Imprisonment up to three years, or/and with fine up to Rs. 500,000
  • Section 66 B-– Receiving stolen computer or communication device—- A person receives or retains a computer resource or communication device which is known to be stolen or the person has reason to believe is stolen—- Imprisonment up to three years, or/and with fine up to Rs. 100,000
  • Section 66 C–Using password of another person— A person fraudulently uses the password, digital signature or other unique identification of another person—- Imprisonment up to three years, or/and with fine up to Rs.100,000
  • Section 66 D— Cheating using computer resource—- If a person cheats someone using a computer resource or communication—– Imprisonment up to three years, or/and with fine up to Rs.100,000
  • Section 66 E— Publishing private images of others—- If a person captures, transmits or publishes images of a person’s private parts without his/her consent or knowledge—- Imprisonment up to three years, or/and with fine up to Rs.200,000
  • Section 66 F— Acts of cyber terrorism—If a person denies authorised personnel access to a computer resource, accesses a protected system or introduces a contaminant into a system, with the intention of threatening the unity, integrity, sovereignty or security of India, then he commits cyber terrorism—– Imprisonment up to life
  • Section 67— Publishing information which is obscene in electronic form—- If a person publishes or transmits or causes to be published in the electronic form any material which is lascivious or appeals to the prurient interest, or if its effect is such as to tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it—– Imprisonment up to five years, or/and with fine up to Rs. 1,000,000
  • Section 67 A— Publishing images containing sexual acts—- If a person publishes or transmits in electronic form any material containing sexually explicit acts or conduct—- Imprisonment up to five years, or/and with fine up to Rs. 1,000,000 on first conviction; up to seven years on subsequent conviction
  • Section 67 B— Publishing child porn or predating children online— If a person captures, publishes or transmits images of a child in a sexually explicit act or conduct. If a person induces a child into a sexual act. A child is defined as anyone under 18—– Imprisonment up to five years, or/and with fine up to Rs.1,000,000 on first conviction. Imprisonment up to seven years, or/and with fine up to Rs. 1,000,000 on second conviction.
  • Section 67 C — Failure to maintain records— Persons deemed as intermediaries (such as an ISP) must maintain required records for the stipulated time. Failure is an offence.—– Imprisonment up to three years, or/and with fine
  • Section 68— Failure/refusal to comply with orders—- The Controller may, by order, direct a Certifying Authority or any employee of such Authority to take such measures or cease carrying on such activities as specified in the order if those are necessary to ensure compliance with the provisions of this Act, rules or any regulations made thereunder. Any person who fails to comply with any such order shall be guilty of an offence—- Imprisonment up to three years, or/and with fine up to Rs. 200,000
  • Section 69— Failure/refusal to decrypt data—- If the Controller is satisfied that it is necessary or expedient so to do in the interest of the sovereignty or integrity of India, the security of the State, friendly relations with foreign States or public order, or for preventing incitement to the commission of any cognizable offence, he may, for reasons to be recorded in writing, by order direct any agency of the Government to intercept any information transmitted through any computer resource. The subscriber or any person in charge of the computer resource shall, when called upon by an agency which has been so directed, extend all facilities and technical assistance to decrypt the information. A subscriber or person who fails to assist the agency is deemed to have committed an offence.—- Imprisonment up to seven years and possible fine.
  • Section 70-— Securing access or attempting to secure access to a protected system—— The appropriate Government may, by notification in the Official Gazette, declare that any computer, computer system or computer network to be a protected system. The appropriate Government may, by order in writing, authorise the persons who are authorised to access protected systems. If a person who secures access or attempts to secure access to a protected system, then he is committing an offence.—– Imprisonment up to ten years, or/and with fine
  • Section 71-—Misrepresentation—— If anyone makes any misrepresentation to, or suppresses any material fact from, the Controller or the Certifying Authority for obtaining any license or Digital Signature Certificate—–Imprisonment up to three years, or/and with fine up to Rs. 100,000

Application of the IT Act

Some sections of the IT Act have been applied on and off in certain cases. For example, Section 66 was applied in February 2001 in one of the first such cases, when the Delhi police arrested two men running a web-hosting company. The company had shut down a website over non-payment of dues; the owner of the site claimed that he had already paid and complained to the police. But the section most used, and most criticized for abusive use, was Section 66A. It was invoked in September 2012 against the freelance cartoonist Aseem Trivedi, who was arrested under Section 66A of the IT Act, Section 2 of the Prevention of Insults to National Honour Act, 1971, and for sedition under Section 124A of the Indian Penal Code. His cartoons depicting widespread corruption in India were considered offensive. Earlier, on 12 April 2012, a chemistry professor from Jadavpur University, Ambikesh Mahapatra, had been arrested for sharing a cartoon of West Bengal Chief Minister Mamata Banerjee and then Railway Minister Mukul Roy.

On 30 October 2012, a Puducherry businessman, Ravi Srinivasan, was arrested under Section 66A. He had sent a tweet accusing Karti Chidambaram, son of then Finance Minister P. Chidambaram, of corruption. Karti Chidambaram had complained to the police. On 19 November 2012, a 21-year-old girl was arrested in Palghar for posting a message on Facebook criticising the shutdown in Mumbai for the funeral of Bal Thackeray. On 18 March 2015, a teenaged boy was arrested in Bareilly, Uttar Pradesh, for making a post on Facebook insulting the politician Azam Khan. The post allegedly contained hate speech against a community and was falsely attributed to Azam Khan by the boy.

Controversy on Section 66A

From the above examples we see how powerful people influenced the police to use Section 66A of the IT Act to gag people and snatch away their freedom of speech and opinion. Therefore, in December 2012, P. Rajeev, a Rajya Sabha member from Kerala, tried to pass a resolution seeking to amend Section 66A. He was supported by D. Bandyopadhyay, Gyan Prakash Pilania, Basavaraj Patil Sedam, Narendra Kumar Kashyap, Rama Chandra Khuntia and Baishnab Charan Parida. P. Rajeev pointed out that cartoons and editorials allowed in traditional media were being censored in the new media. He also said that the law was barely debated before being passed in December 2008. In November 2012, IPS officer Amitabh Thakur and his wife, social activist Nutan Thakur, filed a petition in the Lucknow bench of the Allahabad High Court claiming that Section 66A violated the freedom of speech guaranteed by Article 19(1)(a) of the Constitution of India. They said that the section was vague and frequently misused.

Also in November 2012, a Delhi-based law student, Shreya Singhal, filed a Public Interest Litigation (PIL) in the Supreme Court of India. She argued that Section 66A was vaguely phrased and as a result violated Articles 14, 19(1)(a) and 21 of the Constitution. The PIL was accepted on 29 November 2012. A similar petition was also filed by the founder of MouthShut.com, Faisal Farooqui, and the NGO Common Cause, represented by Prashant Bhushan. In August 2014, the Supreme Court asked the central government to respond to the petition filed by MouthShut.com and to a later petition filed by the Internet and Mobile Association of India (IAMAI), which claimed that the IT Act gave the government power to arbitrarily remove user-generated content.

On 24 March 2015, the Supreme Court of India gave the verdict that Section 66A is unconstitutional in its entirety. The court said that Section 66A of the IT Act, 2000 "arbitrarily, excessively and disproportionately invades the right of free speech" provided under Article 19(1) of the Constitution of India. But the Court turned down a plea to strike down Sections 69A and 79 of the Act, which deal with the procedure and safeguards for blocking certain websites.

Threat to banks

Wanna Decryptor, or WannaCry, a ransomware, attacked banks and high-net-worth individuals in the first half of 2017. The global cyber attack took down, among others, health services in the UK, a telecom network in Spain and government computer systems in Russia in May 2017. In the same month, as many as 102 computer systems of the Andhra Pradesh police were hacked. The malware reportedly halted production at a Nissan-Renault Alliance plant on the outskirts of Chennai, though the company did not comment on the issue. Gulshan Rai, National Cyber Security Adviser in the Prime Minister's Office, however calmed the panic by saying that about 100 systems were attacked but that after mid-May there were no more threats. The international cyber attack was carried out using malware called Wanna Decryptor or WannaCry. This is "ransomware", a digital extortion system that locks down systems by encrypting the data on them, only to decrypt and release the data for a ransom amount. What was more worrying about the global cyber attack was the fact that the outdated Windows XP, which turned out to be the weak link crippling information systems around the world, is used by 70% of Indian ATMs, and, according to a Microsoft spokesperson, complete control of these systems rests with the vendors who provide them to banks. Microsoft stopped providing support (security patches and other tools) for Windows XP in 2014. However, on Saturday, Microsoft said it had released updates for older systems: "Given the potential impact to customers and their businesses, we have also released updates for Windows XP, Windows 8, and Windows Server 2003."

The threat of cybercrime to the global banking and financial services industry is apparent from a tectonic increase in cases over the past few years. The disruptive force of technology has proved to be a double-edged sword, with the quantum of cyber-attacks intensifying with time in the banking sector. For instance, 'Zeus Trojan', a type of malware, wreaked havoc on the internet about a decade back, stealing the banking credentials of users. 'Cryptolocker', a type of ransomware, was then discovered, which could encrypt critical files on the system and demand a ransom (typically in Bitcoins) in exchange for the decryption key. More recently, a lethal ransomware known as 'Mamba' has caused panic across the world. This is because, instead of just encrypting critical files, 'Mamba' encrypts the entire hard disk drive, including the bootloader.
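One defensive idea that follows from how this ransomware works: well-encrypted data is statistically indistinguishable from random bytes, so some anti-ransomware tools watch for files whose Shannon entropy jumps towards the 8 bits-per-byte maximum. A minimal sketch of that heuristic (the sample data below are stand-ins, not real ciphertext):

```python
import math
from collections import Counter

def entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte: 8.0 means perfectly random."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

plain = b"the quick brown fox jumps over the lazy dog " * 50
random_like = bytes(range(256)) * 8  # stand-in for encrypted output

print(round(entropy(plain), 2))        # low, roughly 4 bits/byte
print(round(entropy(random_like), 2))  # 8.0 bits/byte
```

A file whose entropy suddenly rises from the plain-text range to near 8.0 after a write is a red flag worth investigating, though compressed formats (ZIP, JPEG) also sit near the top of the scale, so the heuristic alone produces false positives.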

Phishing, another form of attack led by social engineering, targets consumers who may fall prey to a fake but 'genuine-looking' bank website and eventually hand over their credentials to a hacker. The hacker then uses the credentials to log into the original bank account and transfer funds fraudulently. Distributed denial of service (DDoS) attacks using devices connected to the internet, such as CCTV cameras and mobile phones, have also enabled attackers to potentially deny internet access to an entire country.

Historically, telex (also known as TT or Telegraphic Transfer) was the legacy electronic method used to send overseas payment instructions between financial institutions. While it was a popular means, there were loopholes in its security systems. The need emerged for a system that would be simple and safe and maintain the integrity of the data exchanged. Telex was eventually replaced by a newer and more reliable method created by a non-profit organization known as the Society for Worldwide Interbank Financial Telecommunication (SWIFT) in 1977. SWIFT gained instant popularity, and by 1979 it was already handling more than 1.2 lakh messages per day. Today, SWIFT's messaging services are used extensively by more than 11,000 financial institutions in over 200 countries.

In recent times, cyber criminals have shifted their focus to targeting critical banking infrastructure. SWIFT was created with the intent of providing security and reliability to banks, so the repercussions of successfully breaching its systems could be hazardous. Unlike banking Trojans and ransomware, where each hack might yield thousands of dollars, each SWIFT hack could potentially cost banks millions of dollars. Media reports have suggested such cases in Asia as well as Europe. Keeping aside the reported cases, there is a fair probability that more attacks have occurred but gone unreported due to the reputational damage feared by the institutions. Typically, hackers send fraudulent payment instructions impersonating the operator of a financial institution. They then manipulate or wipe off some data to mask any trail, so that the hack becomes untraceable.

To proactively manage the vulnerabilities that could be exploited by hackers, patches and updates have been rolled out by SWIFT. However, as the compromise often involves internal systems, such steps may not necessarily solve all the problems for an organization.

Additionally, the Reserve Bank of India (RBI) has released a set of guidelines to manage the risks associated with such attacks. RBI’s circular last year covered several notable suggestions, ranging from arrangements for continuous surveillance, creation of a cyber security policy that is distinct from the broader IT policy and an immediate assessment of gaps in preparedness to be reported to the regulator. To diminish future risks and fortify safety mechanisms, institutions using global payment services should conduct a complete security review of their IT infrastructure. Lastly, a proactive forensic analysis of all the systems may be beneficial to ascertain if there has already been a breach or compromise.


Breakthrough in race to commercialize quantum computers

Researchers at International Business Machines Corp have developed a new approach for simulating molecules on a quantum computer. The breakthrough, outlined in a research paper to be published in the scientific journal Nature Thursday, uses a technique that could eventually allow quantum computers to solve difficult problems in chemistry and electro-magnetism that cannot be solved by even the most powerful supercomputers today.

In the experiments described in the paper, IBM researchers used a quantum computer to derive the lowest energy state of a molecule of beryllium hydride. Knowing the energy state of a molecule is a key to understanding chemical reactions. In the case of beryllium hydride, a supercomputer can solve this problem, but the standard techniques for doing so cannot be used for large molecules because the number of variables exceeds the computational power of even these machines. The IBM researchers created a new algorithm specifically designed to take advantage of the capabilities of a quantum computer that has the potential to run similar calculations for much larger molecules, the company said.

The problem with existing quantum computers, including the one IBM used for this research, is that they produce errors, and as the size of the molecule being analyzed grows, the calculation strays further and further from chemical accuracy. The inaccuracy in IBM's experiments varied between 2 and 4 percent, Jerry Chow, the manager of experimental quantum computing for IBM, said in an interview.

Alan Aspuru-Guzik, a professor of chemistry at Harvard University who was not part of the IBM research, said that the Nature paper is an important step. “The IBM team carried out an impressive series of experiments that holds the record as the largest molecule ever simulated on a quantum computer,” he said. But Aspuru-Guzik said that quantum computers would be of limited value until their calculation errors can be corrected.

“When quantum computers are able to carry out chemical simulations in a numerically exact way, most likely when we have error correction in place and a large number of logical qubits, the field will be disrupted,” he said in a statement. He said applying quantum computers in this way could lead to the discovery of new pharmaceuticals or organic materials. IBM has been pushing to commercialize quantum computers and recently began allowing anyone to experiment with running calculations on a 16-qubit quantum computer it has built to demonstrate the technology.

In a classical computer, information is stored using binary units, or bits. A bit is either a 0 or 1. A quantum computer instead takes advantage of quantum mechanical properties to process information using quantum bits, or qubits. A qubit can be both a 0 or 1 at the same time, or any range of numbers between 0 and 1. Also, in a classical computer, each logic gate functions independently. In a quantum computer, the qubits affect one another. This allows a quantum computer, in theory, to process information far more efficiently than a classical computer.
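The qubit idea above can be illustrated with a few lines of plain Python. This is a classical simulation of a single qubit's state vector, not quantum hardware: the two amplitudes of an equal superposition are each 1/√2, and measurement probabilities are the squared magnitudes (the Born rule). The repeated-sampling loop mimics the many-shot averaging the article describes.

```python
import random

# A single qubit as two complex amplitudes, one for |0> and one for |1>.
# An equal superposition has amplitude 1/sqrt(2) on each basis state.
amp0 = amp1 = 2 ** -0.5
p0, p1 = abs(amp0) ** 2, abs(amp1) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5

# A quantum computer repeats a calculation many times and averages the
# measurement outcomes; we mimic that by sampling from the Born-rule
# probabilities.
random.seed(0)
shots = [random.choices([0, 1], weights=[p0, p1])[0] for _ in range(10_000)]
print(sum(shots) / len(shots))  # close to 0.5
```

The "both 0 and 1 at the same time" phrasing in the text corresponds to the two nonzero amplitudes; any single measurement still yields exactly one of the two outcomes, which is why repeated runs are needed.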

The machine IBM used for the Nature paper consisted of seven qubits created from supercooled superconducting materials. In the experiment, six of these qubits were used to map the energy states of the six electrons in the beryllium hydride molecule. Rather than providing a single, precise and accurate answer, as a classical computer does, a quantum computer must run a calculation hundreds of times, with an average used to arrive at a final answer. Chow said his team is currently working to improve the speed of its quantum computer with the aim of reducing the time it takes to run each calculation from seconds to microseconds. He said they were also working on ways to reduce its error rate.

IBM is not the only company working on quantum computing. Alphabet Inc’s Google is working toward creating a 50 qubit quantum computer. The company has pledged to use this machine to solve a previously unsolvable calculation from chemistry or electro-magnetism by the end of the year. Also competing to commercialize quantum computing is Rigetti Computing, a startup in Berkeley, California, which is building its own machine, and Microsoft Corp which is working with an unproven quantum computing architecture that is, in theory, inherently error-free. D-Wave Systems Inc, a Canadian company, is currently the only company to sell quantum computers, although its machines can only be used to solve certain optimization problems.

What is quantum computing?

Quantum computing studies theoretical computation systems (quantum computers) that make direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers are different from binary digital electronic computers based on transistors. Whereas common digital computing requires that the data be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits, which can be in superpositions of states. A quantum Turing machine is a theoretical model of such a computer, and is also known as the universal quantum computer. The field of quantum computing was initiated by the work of Paul Benioff and Yuri Manin in 1980, Richard Feynman in 1982, and David Deutsch in 1985. A quantum computer with spins as quantum bits was also formulated for use as a quantum spacetime in 1968.

Latest Status of Quantum computing

As of 2017, the development of actual quantum computers is still in its infancy, but experiments have been carried out in which quantum computational operations were executed on a very small number of quantum bits. Both practical and theoretical research continues, and many national governments and military agencies are funding quantum computing research in an effort to develop quantum computers for civilian, business, trade, environmental and national security purposes, such as cryptanalysis. A small 5-qubit quantum computer exists and is available for hobbyists to experiment with via the IBM Quantum Experience project.

Uses of Quantum Computing

Large-scale quantum computers would theoretically be able to solve certain problems much more quickly than any classical computer using even the best currently known algorithms, such as integer factorization using Shor’s algorithm or the simulation of quantum many-body systems. There exist quantum algorithms, such as Simon’s algorithm, that run faster than any possible probabilistic classical algorithm. A classical computer could in principle (with exponential resources) simulate a quantum algorithm, as quantum computation does not violate the Church–Turing thesis. On the other hand, quantum computers may be able to efficiently solve problems which are not practically feasible on classical computers.
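For contrast with Shor’s polynomial-time quantum algorithm, the simplest classical baseline for factoring, naive trial division, takes time exponential in the bit length of the input. A minimal sketch:

```python
def trial_division(n):
    """Naive classical factorization by trial division.

    Runs in O(sqrt(n)) division steps, i.e. exponential in the
    number of bits of n, which is why large semiprimes are hard
    to factor classically (the basis of RSA security).
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors
```

Shor’s algorithm, by comparison, factors an n-bit integer in time polynomial in n, which is what makes it a threat to widely used public-key cryptosystems.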


The importance of big data: Constraints and possibilities

We are living in an information age, and information is important for every stakeholder in society: from academia, business, commerce, science and research to the measurement of socio-economic and climatic change and the maintenance of law and order, big data has immense utility. Today there is an increasing tendency to use big data for analysis. Analysis of big data sets can find new correlations to spot business trends, prevent diseases, combat crime and so on. Scientists, business executives, practitioners of medicine, advertising and governments alike regularly meet difficulties with large data sets in areas including Internet search, finance, urban informatics and business informatics. Scientists encounter limitations in e-Science work, including meteorology, genomics, connectomics, complex physics simulations, biology and environmental research.

What is big data?

Big data refers to extremely large data sets that may be analysed computationally to reveal patterns, trends, and associations, especially relating to human behaviour and interactions. The term describes the large volume of data, both structured and unstructured, that inundates a business on a day-to-day basis. But it is not the amount of data that is important: big data can be analyzed for insights that lead to better decisions and strategic business moves. However, traditional data processing application software is inadequate to deal with big data. There are many challenges in handling, processing and analyzing big data, including capture, storage, analysis, data curation, search, sharing, transfer, visualization, querying, updating and information privacy. The term “big data” often refers simply to the use of predictive analytics, user behavior analytics, or certain other advanced data analytics methods that extract value from data, and seldom to a particular size of data set. There is little doubt that the quantities of data now available are indeed large, but that is not the most relevant characteristic of this new data ecosystem.

According to a 2011 McKinsey Global Institute report, the main components of big data are: (1) techniques for analyzing data, such as A/B testing, machine learning and natural language processing; (2) big data technologies, like business intelligence, cloud computing and databases; and (3) visualization, such as charts, graphs and other displays of the data.
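As a small illustration of the first category, a two-proportion z-test of the kind used in A/B testing can be written in a few lines. The conversion counts below are hypothetical:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-statistic for an A/B test:
    does variant B's conversion rate differ significantly from A's?"""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: 200/1000 conversions for A vs 260/1000 for B.
z = two_proportion_z(200, 1000, 260, 1000)
significant = abs(z) > 1.96  # 5% two-sided significance threshold
```

With these numbers the z-statistic is about 3.2, so the difference would be declared significant at the 5% level.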

Application of big data

It is said that big data analysis was partly responsible for the BJP’s win in the 2014 Indian general election. There are many areas where big data is used today or has immense potential to be used. Developed economies increasingly use data-intensive technologies. There are 4.6 billion mobile-phone subscriptions worldwide, and between 1 billion and 2 billion people access the internet. Between 1990 and 2005, more than 1 billion people worldwide entered the middle class, which means more people became literate, which in turn led to information growth. The world’s effective capacity to exchange information through telecommunication networks was 281 petabytes in 1986, 471 petabytes in 1993, 2.2 exabytes in 2000 and 65 exabytes in 2007, and predictions put the amount of internet traffic at 667 exabytes annually by 2014. According to one estimate, one third of the globally stored information is in the form of alphanumeric text and still image data, which is the format most useful for most big data applications. This also shows the potential of as yet unused data (i.e. in the form of video and audio content). While many vendors offer off-the-shelf solutions for big data, experts recommend developing in-house solutions custom-tailored to the problem at hand if the company has sufficient technical capabilities.

The use and adoption of big data within governmental processes allows efficiencies in terms of cost, productivity, and innovation. Research on the effective usage of information and communication technologies for development (also known as ICT4D) suggests that big data technology can make important contributions to, but also presents unique challenges for, international development. Advancements in big data analysis offer cost-effective opportunities to improve decision-making in critical development areas such as health care, employment, economic productivity, crime, security, and natural disaster and resource management. According to the TCS 2013 Global Trend Study, improvements in supply planning and product quality provide the greatest benefit of big data for manufacturing. Big data provides an infrastructure for transparency in the manufacturing industry, which is the ability to unravel uncertainties such as inconsistent component performance and availability. Predictive manufacturing, as an approach toward near-zero downtime and transparency, requires vast amounts of data and advanced prediction tools to systematically process data into useful information. Big data analytics has helped healthcare improve by providing personalized medicine and prescriptive analytics, clinical risk intervention and predictive analytics, waste and care-variability reduction, automated external and internal reporting of patient data, standardized medical terms and patient registries, and fragmented point solutions. Big data can also be greatly helpful in the field of education. A McKinsey Global Institute study found a shortage of 1.5 million highly trained data professionals and managers, and a number of universities, including the University of Tennessee and UC Berkeley, have created master’s programs to meet this demand. Private bootcamps have also developed programs to meet that demand, including free programs like The Data Incubator and paid programs like General Assembly.
Big data is a boon even for media. To understand how the media utilises big data, it is first necessary to provide some context about the mechanism used in the media process. Nick Couldry and Joseph Turow have suggested that practitioners in media and advertising approach big data as many actionable points of information about millions of individuals. The industry appears to be moving away from the traditional approach of using specific media environments such as newspapers, magazines, or television shows, and instead taps into consumers with technologies that reach targeted people at optimal times in optimal locations. The ultimate aim is to serve, or convey, a message or content that is (statistically speaking) in line with the consumer’s mindset. For example, publishing environments are increasingly tailoring messages (advertisements) and content (articles) to appeal to consumers, based on insights gleaned exclusively through various data-mining activities. Big data can also be used in sport to improve training and to understand competitors, using sport sensors. It is possible to predict the winner of a match using big data analytics, and future performance of players could be predicted as well.

Flow of big data and its management

Due to the cheap availability of, and the advent of, many new tools to access data sets, data has grown over the last two decades with unprecedented speed and volume. Data are today increasingly gathered by cheap and numerous information-sensing mobile devices, aerial (remote sensing) platforms, software logs, cameras, microphones, radio-frequency identification (RFID) readers and wireless sensor networks. The world’s technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s; as of 2012, every day 2.5 exabytes (2.5×10^18 bytes) of data were generated.
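The quoted doubling time implies a simple compound-growth rule; a one-line sketch of the arithmetic:

```python
def growth_factor(months, doubling_months=40):
    """Capacity multiplier implied by doubling every `doubling_months` months."""
    return 2 ** (months / doubling_months)

# Doubling every 40 months means an eightfold increase per decade.
per_decade = growth_factor(120)  # 2^(120/40) = 8
```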

Architecture and new technology to support big data analysis

Since the beginning of the new millennium there have been ongoing efforts to develop methods for managing big data. In 2000, Seisint Inc. (now LexisNexis Group) developed a C++-based distributed file-sharing framework for data storage and query. The system stores and distributes structured, semi-structured, and unstructured data across multiple servers. Users can build queries in a C++ dialect called ECL. ECL uses an “apply schema on read” method to infer the structure of stored data when it is queried, instead of when it is stored. In 2004, LexisNexis acquired Seisint Inc. and in 2008 acquired ChoicePoint, Inc. and its high-speed parallel processing platform. The two platforms were merged into HPCC (High-Performance Computing Cluster) Systems, and in 2011 HPCC was open-sourced under the Apache v2.0 License. The Quantcast File System became available around the same time.
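The “apply schema on read” idea is independent of ECL and can be illustrated generically. The sketch below (plain Python, not ECL) stores raw records untyped and applies a caller-supplied schema only at query time:

```python
# Schema-on-read: raw records are stored as untyped strings; a schema
# is supplied and applied only when the data is queried, not on ingest.
raw_store = [
    "alice,34,engineer",
    "bob,29,analyst",
]

def query(store, schema):
    """Parse each raw record using the schema supplied at read time.

    `schema` maps field names to type-casting callables, in field order.
    """
    names = list(schema)
    casts = list(schema.values())
    return [
        {name: cast(field)
         for name, cast, field in zip(names, casts, line.split(","))}
        for line in store
    ]

# The same raw bytes could be read with a different schema tomorrow.
rows = query(raw_store, {"name": str, "age": int, "role": str})
```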

In 2004, Google published a paper on a process called MapReduce that uses a similar architecture. The MapReduce concept provides a parallel processing model, and an associated implementation was released to process huge amounts of data. With MapReduce, queries are split and distributed across parallel nodes and processed in parallel (the Map step). The results are then gathered and delivered (the Reduce step). The framework was very successful, so others wanted to replicate the algorithm. Therefore, an implementation of the MapReduce framework was adopted by an Apache open-source project named Hadoop.
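The Map and Reduce steps described above can be sketched in miniature. In a real deployment the map calls would run on separate parallel nodes, but the data flow is the same:

```python
from collections import defaultdict

def map_step(document):
    """Map: emit (word, 1) pairs from one document."""
    return [(word, 1) for word in document.split()]

def reduce_step(pairs):
    """Reduce: sum the counts gathered for each word."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data is big", "data about data"]

# In a cluster, each map_step call runs on its own node; here we chain them.
pairs = [pair for doc in docs for pair in map_step(doc)]
word_counts = reduce_step(pairs)  # e.g. word_counts["data"] == 3
```

This word-count pattern is the canonical MapReduce example; Hadoop generalizes it with distributed storage, shuffling, and fault tolerance.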

MIKE2.0 is an open approach to information management that acknowledges the need for revisions due to big data implications identified in an article titled “Big Data Solution Offering”. The methodology addresses handling big data in terms of useful permutations of data sources, complexity in interrelationships, and difficulty in deleting (or modifying) individual records.

Recent Developments in big data management

2012 studies showed that multiple-layer architecture is one option to address the issues that big data presents. A distributed parallel architecture distributes data across multiple servers; these parallel execution environments can dramatically improve data processing speeds. This type of architecture inserts data into a parallel DBMS, which implements the use of MapReduce and Hadoop frameworks. This type of framework looks to make the processing power transparent to the end user by using a front-end application server. Big data analytics for manufacturing applications is marketed as a 5C architecture (connection, conversion, cyber, cognition, and configuration). The data lake allows an organization to shift its focus from centralized control to a shared model to respond to the changing dynamics of information management. This enables quick segregation of data into the data lake, thereby reducing the overhead time.

Multidimensional big data can also be represented as tensors, which can be more efficiently handled by tensor-based computation, such as multilinear subspace learning. Additional technologies being applied to big data include massively parallel-processing (MPP) databases, search-based applications, data mining, distributed file systems, distributed databases, cloud-based infrastructure (applications, storage and computing resources) and the Internet. Some but not all MPP relational databases have the ability to store and manage petabytes of data. Implicit is the ability to load, monitor, back up, and optimize the use of the large data tables in the RDBMS.
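The basic operation behind tensor-based methods such as multilinear subspace learning is the mode-n unfolding, which flattens one axis of a tensor against all the others. A minimal sketch:

```python
import numpy as np

# A 3-way tensor, e.g. (users x items x time) interaction counts.
T = np.arange(24).reshape(2, 3, 4)

def unfold(tensor, mode):
    """Mode-n unfolding: arrange the mode-n fibers of the tensor as the
    rows of a matrix, the building block of multilinear analysis."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

M0 = unfold(T, 0)  # shape (2, 12): users against all (item, time) pairs
M1 = unfold(T, 1)  # shape (3, 8):  items against all (user, time) pairs
```

Multilinear subspace learning then applies matrix factorizations (e.g. SVD) to each unfolding rather than flattening the whole tensor into one long vector.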

Issues in real time Sharing of big data

Practitioners of big data analytics processes are generally hostile to slower shared storage, preferring direct-attached storage (DAS) in its various forms, from solid-state drives (SSD) to high-capacity SATA disks buried inside parallel processing nodes. The perception of shared storage architectures — storage area network (SAN) and network-attached storage (NAS) — is that they are relatively slow, complex, and expensive. These qualities are not consistent with big data analytics systems that thrive on system performance, commodity infrastructure, and low cost.

Real or near-real time information delivery is one of the defining characteristics of big data analytics. Latency is therefore avoided whenever and wherever possible. Data in memory is good—data on spinning disk at the other end of a FC SAN connection is not. The cost of a SAN at the scale needed for analytics applications is very much higher than other storage techniques. There are advantages as well as disadvantages to shared storage in big data analytics, but big data analytics practitioners as of 2011 did not favour it.

Big data, Internet of Things (IoT) and Technology

Big data and the IoT work in conjunction. Data extracted from IoT devices provides a mapping of device inter-connectivity. Such mappings have been used by the media industry, companies and governments to more accurately target their audiences and increase media efficiency. IoT is also increasingly adopted as a means of gathering sensory data, and this sensory data has been used in medical and manufacturing contexts. Many e-commerce and social media platforms use big data with their own systems for managing it. eBay.com uses two data warehouses at 7.5 petabytes and 40PB, as well as a 40PB Hadoop cluster, for search, consumer recommendations, and merchandising. Amazon.com handles millions of back-end operations every day, as well as queries from more than half a million third-party sellers. The core technology that keeps Amazon running is Linux-based, and as of 2005 it had the world’s three largest Linux databases, with capacities of 7.8 TB, 18.5 TB, and 24.7 TB. Facebook handles 50 billion photos from its user base. Google was handling roughly 100 billion searches per month as of August 2012. Oracle NoSQL Database has been tested to pass the 1M ops/sec mark with 8 shards and went on to hit 1.2M ops/sec with 10 shards.

Difficulties in handling big data

There are many problems in the handling of big data. Beyond the technical issues already discussed, there are issues pertaining to the accuracy of predictions and analysis arising from behavioral factors. Relational database management systems and desktop statistics and visualization packages often have difficulty handling big data; the work may require “massively parallel software running on tens, hundreds, or even thousands of servers”. What counts as “big data” varies depending on the capabilities of the users and their tools, and expanding capabilities make big data a moving target. For some organizations, facing hundreds of gigabytes of data for the first time may trigger a need to reconsider data management options; for others, it may take tens or hundreds of terabytes before data size becomes a significant consideration. One question for large enterprises is determining who should own big-data initiatives that affect the entire organization.

Snijders, Matzat, and Reips point out that analyses often make very strong assumptions about the mathematical properties of big data that may not at all reflect what is really going on at the level of micro-processes. Mark Graham has leveled broad critiques at Chris Anderson’s assertion that big data will spell the end of theory, focusing in particular on the notion that big data must always be contextualized in its social, economic, and political contexts. Even as companies invest eight- and nine-figure sums to derive insight from information streaming in from suppliers and customers, less than 40% of employees have sufficiently mature processes and skills to do so. To overcome this insight deficit, big data, no matter how comprehensive or well analysed, must be complemented by “big judgment,” according to an article in the Harvard Business Review. In the same vein, it has been pointed out that decisions based on the analysis of big data are inevitably “informed by the world as it was in the past, or, at best, as it currently is”. Privacy advocates are concerned about the threat to privacy represented by the increasing storage and integration of personally identifiable information; expert panels have released various policy recommendations to conform practice to expectations of privacy. Nayef Al-Rodhan argues that a new kind of social contract will be needed to protect individual liberties in a context of big data and giant corporations that own vast amounts of information, and that the use of big data should be monitored and better regulated at the national and international levels. Ulf-Dietrich Reips and Uwe Matzat wrote in 2014 that big data had become a “fad” in scientific research. Researcher Danah Boyd has raised concerns about the use of big data in science neglecting principles such as choosing a representative sample, out of excessive concern for actually handling the huge amounts of data.
This approach can bias results one way or another. Nonetheless, the use of big data in the information age is indispensable. The technology and methods used to handle big data, as well as the motives behind the analysis, will determine the end results. Privacy and liberty will remain big questions, as will what would happen if cyber warfare led to the collapse of big data.


ISRO launches 104 satellites in one go: Establishes its credentials as low cost and efficient commercial launcher

The Indian Space Research Organisation (ISRO) set a new record in space missions when it successfully launched 104 satellites into their desired orbits in one go from the Satish Dhawan Space Centre in Sriharikota, Andhra Pradesh, on a February 2017 morning. This was ISRO’s first space mission of 2017, and the most complicated mission it has ever carried out. The feat was performed on the old reliable launch vehicle, the PSLV, numbered C-37, which took off from the first launch pad at the Satish Dhawan Space Centre, Sriharikota, at 9.28 a.m. C-37 was a largely commercial flight, as all but three of the passenger satellites, small nanosats, belonged to six other countries. PSLV Project Director B. Jayakumar at the Vikram Sarabhai Space Centre (VSSC) said, “It is confirmed that all 104 satellites have been successfully deployed in orbit.” ISRO said that after separation, the two solar arrays of the Cartosat-2 series satellite were deployed automatically, and the ISRO Telemetry, Tracking and Command Network (ISTRAC) at Bengaluru took over control of the satellite. In the coming days, the satellite will be brought to its final operational configuration.

According to ISRO, the 29-minute mega-payload launch went off precisely as planned; it took just 11 minutes from the release of the primary Cartosat-2 series spacecraft to the release of the last client satellite. The PSLV, in the category of launch vehicles that can lift relatively light loads to space, now marks 38 successful missions in a row out of a total of 39 flights. This time, it took to space a total of 1,378 kg, of which the primary satellite was 714 kg.

The latest Cartosat is the fifth in the series of six Cartosat-2 spacecraft, starting with Cartosat-2 in 2007 and followed by those earlier designated A, B, C, D and E; the last one is due. An official communiqué said, “After a flight of 16 minutes and 48 seconds, the satellites achieved a polar Sun synchronous orbit of 506 km inclined at an angle of 97.46 degrees to the equator — very close to the intended orbit. In the next 12 minutes, all 104 satellites successfully separated from the PSLV fourth stage in a predetermined sequence, beginning with the Cartosat-2 series, INS-1 and INS-2.”
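The orbital parameters quoted above imply an orbital period that can be checked with the standard circular-orbit formula. This is an illustrative calculation from textbook constants, not an ISRO figure:

```python
import math

MU_EARTH = 398_600.4418  # km^3/s^2, Earth's standard gravitational parameter
R_EARTH = 6_371.0        # km, mean Earth radius

def circular_period_minutes(altitude_km):
    """Period of a circular orbit: T = 2*pi*sqrt(a^3 / mu)."""
    a = R_EARTH + altitude_km  # semi-major axis in km
    return 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH) / 60

# A 506 km Sun-synchronous orbit circles the Earth in roughly 95 minutes.
period = circular_period_minutes(506)
```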

The PSLV, which created launch history by placing a record 104 spacecraft in their desired orbits, has launched a total of 46 Indian spacecraft, most of them Indian Remote Sensing (IRS) satellites. As many as 180 small satellites of foreign customers, contracted through ISRO’s commercial company Antrix Corporation, have also reached space on this vehicle. The PSLV, 39 flights old since 1993, also launched the Indian Moon mission Chandrayaan-1 in 2008, and is set to launch a private lunar mission for the Bengaluru start-up Team Indus in late December this year.

The PSLV first released the 714 kg Cartosat-2 series earth observation satellite, followed by INS-1A and INS-1B, after it reached the polar Sun synchronous orbit. It then went on to inject the 103 co-passenger satellites, together weighing about 664 kg, in pairs. ISRO scientists used the XL variant, the most powerful configuration of the rocket, earlier used in the ambitious Chandrayaan mission and the Mars Orbiter Mission (MOM).

Till 2016, ISRO had launched 75 foreign satellites out of a total of 121. Forex revenues of ISRO’s commercial arm, Antrix Corporation, rose 204.9% in 2015 on the strength of foreign satellite launches. In 2015-16, commercial launches brought in Rs 230 crore, which was 4% of ISRO’s average spending over the previous three years. ISRO’s low prices are drawing foreign clients; private space programmes such as Arianespace and SpaceX are yet to match its cost-effectiveness. Besides, ISRO has a 100% success rate in foreign satellite launches. While a satellite launch on an Arianespace rocket costs about $100 million after subsidies, SpaceX charges $60 million. In contrast, ISRO charged an average of $3 million per satellite between 2013 and 2015.


NASA discovered 7 Earth-like planets

NASA scientists announced the discovery of an entire system of Earth-sized planets. Of these seven planets, the six innermost are rocky like Earth.

Three of the planets lie in the star’s habitable zone. The habitable zone, or Goldilocks zone, is the region surrounding a star in which liquid water could theoretically exist. This means that all three of these planets may have liquid water, an indication that dramatically increases the possibility of life. This is the largest number of Earth-sized planets yet found that could support liquid water.

According to the scientists, the star in this system, TRAPPIST-1, is an “ultra-cool dwarf”. The energy output from dwarf stars like TRAPPIST-1 is much weaker than that of our Sun, so planets would need to be in far closer orbits than we see in the Solar System if they are to have surface water. Fortunately, this kind of compact configuration is just what we see around TRAPPIST-1.
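The point about closer orbits can be made quantitative with the inverse-square law: a planet receives Earth-like stellar flux at a distance of sqrt(L / L_sun) astronomical units. The luminosity used below is an assumed round number (roughly 0.05% of the Sun’s) chosen only for illustration:

```python
import math

def earthlike_flux_distance_au(luminosity_solar):
    """Distance at which a planet receives Earth-like stellar flux,
    from the inverse-square law: d [AU] = sqrt(L / L_sun)."""
    return math.sqrt(luminosity_solar)

# Assumption: an ultra-cool dwarf emitting ~0.05% of the Sun's luminosity.
d = earthlike_flux_distance_au(5.0e-4)  # about 0.02 AU, well inside Mercury's orbit
```

This is why the TRAPPIST-1 planets, with orbital periods of days rather than months, can still be temperate.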

Michael Gillon leads the TRAPPIST collaboration, which hunts for planets using two 60-centimetre telescopes in Chile and Morocco. Gillon said that the team initially reported three planets around the star, known as TRAPPIST-1, last May. Further research revealed that what had been taken for a single planet was in fact four, which orbit TRAPPIST-1 roughly every 4, 6, 9 and 12 days. Those four joined the two innermost planets, which whirl around the star once every 1.5 days and 2.4 days. The team also found a seventh, more distant planet.

The six inner planets probably formed farther away from their star and then migrated inward. They are now so close to each other that their gravitational fields interact, which enabled the team to estimate each planet’s mass; the masses range from around 0.4 to 1.4 times that of Earth.

Distance from Our Earth

The system is just 40 light-years away. Although it would take us millions of years to get there with today’s technology, these findings are an important milestone in the search for life beyond Earth.

Future

The Hubble Space Telescope is already being used to search for atmospheres around the planets. Emmanuel Jehin, a scientist who also worked on the research, asserts that “With the upcoming generation of telescopes, such as ESO’s European Extremely Large Telescope and the NASA/ESA/CSA James Webb Space Telescope, we will soon be able to search for water and perhaps even evidence of life on these worlds.”

This revelation is also important for researchers who are working to compare how worlds evolve. The TRAPPIST-1 system probably has a similar variety of worlds. “If one of these planets hosts life and the adjacent one doesn’t, why not?” asks Sarah Ballard, an astronomer at the Massachusetts Institute of Technology (MIT) in Cambridge.

Although at least some fraction of each planet could harbour liquid water, it doesn’t necessarily follow that they are habitable. TRAPPIST-1 emits about the same amount of X-ray and ultraviolet radiation as the Sun does, which would make it much more challenging for life to thrive.

 


China launches world’s first quantum communications satellite

China has launched the world’s first quantum communications satellite, an achievement in high technology and a milestone in space science and cryptography. Quantum Experiments at Space Scale (QUESS), nicknamed Micius after the ancient philosopher, lifted off from the Jiuquan Satellite Launch Center at 1:40 AM local time (August 15, 2016 in the U.S.) and is currently maneuvering itself into a sun-synchronous orbit at 500 km.

QUESS is an experiment in the deployment of quantum cryptography — specifically, a prototype that will test whether it’s possible to perform this delicate science from space. Inside QUESS is a crystal that can be stimulated into producing two photons that are “entangled” at a subatomic, quantum level. Entangled photons have certain aspects — polarization, for example — that are the same for both regardless of distance — in fact, the satellite will test that at 1,200 km, which will set a new record. The trouble with this tech is that photons are rather finicky things, and tend to be bounced, absorbed, and otherwise interfered with when traveling through fibers, air, and so on. QUESS will test whether sending them through space is easier, and whether one of a pair of entangled photons can be successfully sent to the surface while the other remains aboard the satellite.

If this proves possible, the satellite will attempt quantum key distribution via these entangled photons. When measured, a photon will show its observers a random polarization state — but critically, entanglement means the other photon will always show the same random state. These correlated polarizations can be the basis of a cryptographic key known only to the observers.
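The key-distribution idea can be mimicked with a toy simulation, under the drastic simplification that each entangled pair yields one perfectly correlated random bit for the two observers (real QKD additionally involves basis choices and error checking):

```python
import random

def entangled_pair(rng):
    """Toy model of an entangled photon pair: measuring either photon
    yields the same random polarization bit for both observers."""
    bit = rng.randint(0, 1)
    return bit, bit  # (Alice's result, Bob's result)

rng = random.Random(42)
alice_key, bob_key = zip(*(entangled_pair(rng) for _ in range(128)))

# The two keys are random but identical, so they can serve as a shared secret.
keys_match = alice_key == bob_key
```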

The best thing about this is that apart from the original distribution of the photons, there is no transmission involved, or at least not one we understand and can intercept. Whatever links the two photons is intangible and undetectable: you can’t entangle a third one to listen in, and even if you managed to interfere with the process, it would be immediately noticed by the observers of the original entangled photons, which would cease to be perfectly correlated. As you can imagine, an undetectable and perfectly secure channel for digital communications is of enormous potential value for an endless list of reasons. China is early to the game with QUESS, but they’re not the only ones playing. Other quantum satellites, though none quite so advanced, are in the ether right now, and more are sure to come. The experiments from the whole set will definitely be interesting — if anyone can find a way to explain what’s going on in them.

Quantum communication

Quantum communication is a field of applied quantum physics closely related to quantum information processing and quantum teleportation. Its most interesting application is protecting information channels against eavesdropping by means of quantum cryptography. The most well-known and developed application of quantum cryptography is quantum key distribution (QKD), the use of quantum mechanical effects to perform cryptographic tasks or to break cryptographic systems. The principle of operation of a QKD system is quite straightforward: two parties (Alice and Bob) use single photons, randomly polarized to states representing ones and zeroes, to transmit a series of random number sequences that are used as keys in cryptographic communications. The two stations are linked by a quantum channel and a classical channel. Alice generates a random stream of qubits that are sent over the quantum channel. Upon reception of the stream, Bob and Alice, using the classical channel, perform classical operations to check whether an eavesdropper has tried to extract information about the qubit stream. The presence of an eavesdropper is revealed by imperfect correlation between the two lists of bits obtained after the transmission of qubits between the emitter and the receiver. One important component of virtually all proper encryption schemes is true randomness, which can elegantly be generated by means of quantum optics.
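The QKD procedure described above can be sketched with a toy prepare-and-measure (BB84-style) simulation. The disturbance model is a simplification, but it shows the essential effect: an eavesdropper who guesses the wrong basis randomizes results, raising the error rate between the two sifted keys to around 25%:

```python
import random

def measure(bit, photon_basis, measure_basis, rng):
    """Measuring in the photon's own basis returns its bit;
    a mismatched basis gives a random result."""
    return bit if measure_basis == photon_basis else rng.randint(0, 1)

def bb84_qber(n, eavesdrop, rng):
    """Run n rounds of a toy BB84 exchange and return the quantum bit
    error rate (QBER) of the sifted key shared by Alice and Bob."""
    sifted_a, sifted_b = [], []
    for _ in range(n):
        bit = rng.randint(0, 1)              # Alice's random bit
        a_basis = rng.randint(0, 1)          # Alice's random basis
        photon_bit, photon_basis = bit, a_basis
        if eavesdrop:                        # Eve measures and re-sends
            e_basis = rng.randint(0, 1)
            photon_bit = measure(photon_bit, photon_basis, e_basis, rng)
            photon_basis = e_basis
        b_basis = rng.randint(0, 1)          # Bob's random basis
        result = measure(photon_bit, photon_basis, b_basis, rng)
        if b_basis == a_basis:               # sifting: keep matching bases
            sifted_a.append(bit)
            sifted_b.append(result)
    errors = sum(a != b for a, b in zip(sifted_a, sifted_b))
    return errors / max(len(sifted_a), 1)

rng = random.Random(7)
qber_clean = bb84_qber(4000, eavesdrop=False, rng=rng)   # no errors
qber_tapped = bb84_qber(4000, eavesdrop=True, rng=rng)   # roughly 25% errors
```

Comparing a publicly disclosed sample of the sifted key is exactly the "classical operations" check the paragraph above refers to: a near-zero error rate means no eavesdropper; a rate near 25% means the channel was tapped.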

Quantum communication is thus the art of transferring a quantum state from one place to another. Traditionally, the sender is named Alice and the receiver Bob. Quantum information science is an area of study based on the idea that information science depends on quantum effects in physics. It includes theoretical issues in computational models as well as more experimental topics in quantum physics including what can and cannot be done with quantum information. The term quantum information theory is sometimes used, but it fails to encompass experimental research in the area.

Subfields include:

  • Quantum computing, which deals on the one hand with the question of how and whether one can build a quantum computer, and on the other hand with algorithms that harness its power.
  • Quantum complexity theory
  • Quantum cryptography and its generalization, quantum communication
  • Quantum error correction
  • Quantum communication complexity
  • Quantum entanglement, as seen from an information-theoretic point of view
  • Quantum dense coding

Quantum networks form an important element of quantum computing and quantum cryptography systems. Quantum networks allow for the transportation of quantum information between physically separate quantum systems. In distributed quantum computing, network nodes within the network can process information by serving as quantum logic gates. Secure communication can be implemented using quantum networks through quantum key distribution algorithms.

Optical quantum networks using fiber optic links or free-space links play an important role transmitting quantum states in the form of photons across large distances. Optical cavities can be used to trap single atoms and can serve as storage and processing nodes in these networks.

Quantum teleportation is a well-known quantum information processing operation, which can be used to move any arbitrary quantum state from one particle (at one location) to another.

In quantum information theory, a quantum channel is a communication channel which can transmit quantum information, as well as classical information. An example of quantum information is the state of a qubit. An example of classical information is a text document transmitted over the Internet.

More formally, quantum channels are completely positive (CP) trace-preserving maps between spaces of operators. In other words, a quantum channel is just a quantum operation viewed not merely as the reduced dynamics of a system but as a pipeline intended to carry quantum information.
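The trace-preservation condition can be made concrete with a standard textbook example, the depolarizing channel, written in Kraus form: rho maps to the sum of K_i rho K_i-dagger, with the K_i-dagger K_i summing to the identity. This is a sketch using NumPy; the probability p is an arbitrary illustrative value:

```python
import numpy as np

# Kraus operators of the single-qubit depolarizing channel with error
# probability p: with probability p the state is hit by a random Pauli.
p = 0.2
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
kraus = [np.sqrt(1 - 3 * p / 4) * I] + [np.sqrt(p / 4) * P for P in (X, Y, Z)]

def apply_channel(rho, ops):
    """Apply a quantum channel given by its Kraus operators."""
    return sum(K @ rho @ K.conj().T for K in ops)

# Complete positivity + trace preservation: sum_i K_i^dagger K_i = I.
completeness = sum(K.conj().T @ K for K in kraus)

rho_in = np.array([[1, 0], [0, 0]], dtype=complex)  # pure state |0><0|
rho_out = apply_channel(rho_in, kraus)              # trace stays equal to 1
```

Verifying that the completeness sum equals the identity is exactly the check that the map is a valid (trace-preserving) quantum channel.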