Information technology covers a vast field. The dictionary definition of information technology is that it involves "the development, maintenance, and use of computer systems, software, and networks for the processing and distribution of data." In the 21st century, computers are built into millions of other machines, but the definition of IT excludes some types of digital technology.
Information technology includes computers, software and networks that process and share data.
History of the Term IT
The first use of the term "information technology" dates back to 1958, decades before the personal computer era. It was around this point that computers were becoming more than just incredibly powerful adding machines that made complex calculations. Once they could index and sort information, they became capable of much more.
The Harvard Business Review coined "information technology" to distinguish programmable computers from computerized tech built for a specific purpose. Laptops, tablets and digital phones are programmable devices that can carry out a wide variety of tasks. They're all part of the IT world. Televisions are not, even though modern TVs are computerized. A digital fire alarm wouldn't qualify as IT either: It's built for one specific mission and nothing else.
Computers' ability to process and distribute information is a key part of the IT definition. The sizable computer-manufacturing industry is essential to IT, but it isn't usually counted as an information technology industry any more than manufacturing a fiber-optic cable or a lap desk is.
Programming Languages Created IT
Computers have been around a lot longer than IT. There are 19th-century devices, such as the Jacquard loom, that could be programmed to carry out various mechanical tasks. They might qualify as computers, but not as information technology.
What made the leap from computing to true IT was the development of higher-level programming languages. Early programming was heavily mathematical, relying on numerical machine code. The only way to direct the computer was through numbers, so mathematicians and engineers were the primary programmers.
Things changed with the birth of the assembler and, later, the compiler. These programs translate human-readable instructions, first assembly mnemonics and then statements in high-level languages, into the machine code the computer actually executes. This was the first step in the long journey to being able to make a computer save or delete with just a simple keyboard command.
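The idea can be sketched with a toy example (purely illustrative, not any historical compiler): a single friendly command expands into the several lower-level steps a machine would actually carry out.

```python
# A toy "compiler": translates a tiny high-level command such as
# "save notes.txt" into lower-level steps. Purely illustrative --
# real compilers emit machine code, not strings.

def compile_command(source):
    """Translate one high-level command into a list of low-level steps."""
    verb, _, arg = source.partition(" ")
    if verb == "save":
        # One simple keyboard command hides several machine operations.
        return [f"OPEN {arg}", "WRITE BUFFER", f"CLOSE {arg}"]
    if verb == "delete":
        return [f"UNLINK {arg}"]
    raise ValueError(f"unknown command: {source!r}")

print(compile_command("save notes.txt"))
# ['OPEN notes.txt', 'WRITE BUFFER', 'CLOSE notes.txt']
```

The point of the sketch is the direction of translation: the user works in friendly commands, and a translation layer produces the instructions the machine understands.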
Types of Information Technology
The definition of information technology covers a lot of ground: physical hardware, operating systems, applications, peripherals and virtual tools. Examples include:
- Computer applications for inputting, processing and outputting data.
- Server hardware and software supporting databases, data storage, email and printers.
- Digital voice, video and data networks and all the related communications equipment and software.
- Digital telephone equipment and networks.
- Word processing and spreadsheet software.
- Personal computers.
IT also refers to the architectures, methods and regulations governing how companies, networks and individuals store and use data. For example, it includes virtualization and cloud computing, where data and software are stored and pooled remotely. Cloud services may be distributed across locations and shared with other IT users, contained within a corporate data center, or some combination of both deployments. It's nothing anyone imagined back when the words "information technology" were first used.
What Are Information Technology Jobs?
An industry as big as IT employs millions of people in a wide variety of jobs. There's a huge demand for the kind of specialists who can keep the IT world running. Fortunately, working in IT no longer requires a mathematical background. People with a wide variety of experience enter and thrive in the industry.
The information technology field offers many different career paths. IT workers can specialize in fields like software development, application management, desktop support, server administration, storage administration and network architecture. A given organization's staff might include, for example:
- The chief information officer who oversees the organization's IT and computer systems.
- The chief technology officer who sets tech goals and policies.
- The IT director or IT manager responsible for keeping all the company's tech, tools and processes running.
- A systems administrator who configures, manages, supports and troubleshoots a multi-user computing environment. Some organizations have different sys-admins for server, desktop, network and other parts of the system.
- An applications manager who oversees a high-value business app.
- Developers who write, update and test code.
- An architect who examines and changes IT functions to support the organization.
Some IT professionals find success by specializing in a particular IT function, such as databases, services or security. Others combine IT with another skill, such as graphic design.
The Industry Keeps Growing
The IT industry is steadily growing. At the time of writing, annual spending on IT is projected to hit $4.8 trillion, and $1.5 trillion of that is spent in the U.S. About 5.4 million people work in the industry in the United States.
The U.S. is the world's largest tech market, though the Asia-Pacific market that includes China, Japan and Australia is almost its equal. Most spending comes from corporations and governments, rather than individuals or small home-based businesses.
A big challenge for the information technology industry is that the need for skilled IT workers keeps outstripping the supply. As it grows, IT is grappling with how to recruit, train and manage talent, how to maintain a diverse workforce and what makes a desirable career path. Several factors make the landscape for hiring new talent tougher than it used to be:
- Competition among IT firms for the best new talent.
- A shortage of workers with expertise in emerging skills such as AI development, penetration testing and drone operations.
- Rising salary demands.
- The need for soft skills: IT workers have to interact with teams, project managers and other departments, so technical ability by itself isn't enough.
- Geographic mismatches: even if there are plenty of workers in the U.S. to fill a particular slot, a company in a given area may not be able to find local workers to do the job, and depending on the locale, it may not be easy to convince new recruits to move there.
At the time of writing, 53 percent of the world's IT spending goes to traditional categories: hardware, software and services. Telecom services account for 30 percent; 17 percent is spent on emerging technologies that don't fit neatly into one category or combine multiple categories.
In part because the U.S. has a well-developed IT infrastructure, almost half of American spending goes toward software and hardware. In other countries, basic hardware and telecommunications eat up more of the spending. However, countries that don't have a big IT infrastructure have the option to leapfrog ahead and develop newer technologies. Countries that start out with cell phones don't have to build land-line networks, for instance.
IT Changes Constantly
Few industries have changed as much as IT over the past few decades. Even professionals in the industry can find themselves falling behind as new technologies make their specialty obsolete. Several current trends look set to shape the future of the industry and of the people who use its products. These include:
- Cloud computing, in which companies store data on remote servers rather than on local laptops or desktops, lets them pay for just the storage they need rather than buying more hardware than necessary.
- Cognitive computing and artificial intelligence allow computers to perform more complex, difficult tasks that used to rely on human judgment.
- User-friendly apps and tools enable growing numbers of customers to use computers without having to write code or understand how the technology they're using actually works.
- Cybersecurity threats are a growing problem. Hackers are increasingly skilled at exploiting security weaknesses to steal everything from personal data to tax refunds.
- The ability to influence Americans with propaganda has hit new levels as foreign nations use IT to spread false rumors on social media.
The Impact of IT
The changes wrought by IT spread far beyond the information technology industry itself. IT plays a role in almost every industry, shaping and reshaping them in different ways.
- By making marketing more efficient and information more accessible, IT improves management performance in small and medium-size companies. IT can improve communication, financial planning, staffing and organization.
- The construction industry uses information technology in various ways, for example, relying on cloud computing for financial management.
- Information technology in health care enables doctors and hospitals to keep better records and manage patient data more effectively.
- Brick-and-mortar retailers still use human salesclerks, but they use IT in many other ways. IT helps companies keep track of customer data and manage inventory, so that they neither run out of stock nor stock more items than they need.
- Multiple car companies are working to develop a reliable driverless car.
- The Internet of Things incorporates IT into ordinary household technology. Anything with an off switch, from a floor lamp to a coffee maker, can be controlled remotely via the Internet. In theory, your Internet-connected alarm clock could start your coffee maker as soon as it wakes you up.
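The alarm-clock scenario above can be sketched as a small simulation. The device names, topic string and "bus" class here are all invented for illustration; real smart-home devices would exchange messages over a network protocol such as MQTT rather than an in-memory object.

```python
# A minimal simulation of two IoT devices sharing a message bus.
# Everything here (names, topics) is illustrative; real devices would
# publish over a network protocol such as MQTT via a broker.

class MessageBus:
    """In-memory stand-in for a network message broker."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, handler):
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers.get(topic, []):
            handler(message)

class CoffeeMaker:
    def __init__(self, bus):
        self.brewing = False
        bus.subscribe("alarm/fired", self.on_alarm)

    def on_alarm(self, message):
        self.brewing = True  # start brewing when the alarm goes off

bus = MessageBus()
maker = CoffeeMaker(bus)
bus.publish("alarm/fired", "07:00")  # the alarm clock wakes you...
print(maker.brewing)                 # ...and the coffee maker starts
```

The design mirrors how real IoT systems decouple devices: the alarm clock doesn't need to know a coffee maker exists, it just announces an event, and any subscribed device reacts.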
The advantages IT brings come at a price. The Internet of Things, for instance, makes data and networks increasingly vulnerable, and companies that use IT need a dedicated staffer or an outside firm to manage that risk. Too often, businesses place valuable data online without increasing their security to match, and they don't always integrate their data so that different software packages have access to the same information.
Most organizations using IT favor a policy of gradual change, adding or upgrading their equipment piecemeal. Few companies plan carefully for the increasingly digital future. Small businesses often question whether investing in IT gives a good return on their money: Upfront costs can be steep, maintenance and upgrade costs are often expensive and some products are still too complex to use easily. Small companies that can't dedicate resources to an IT department have to settle for just getting by.
IT and New Businesses
A start-up company may have an edge over established firms when it comes to using IT. Instead of having to rework and transform the established way of doing things, a start-up can incorporate IT into its operations from the first. It doesn't hurt that IT is becoming increasingly easy to use, even for non-IT professionals, and that prices keep dropping.
The first step to incorporating IT is to list all its potential uses – producing invoices, emailing vendors, bookkeeping, inventory control, managing marketing and maintaining staff records. Entrepreneurs should think about the sort of computing devices they want. If they're constantly on the road, a smartphone and a tablet might make more sense than a desktop computer. It's also important to think about sharing data. If a desktop calendar app can't share appointments with the user's phone, that could be inconvenient. While a business probably can't anticipate every possible IT need it will have, it should plan for tech tools that can network together and trade data back and forth.
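The data-sharing point can be illustrated with a minimal sketch (the record fields and values are hypothetical): storing business records in a plain, portable format such as JSON means any tool that can read it, whether invoicing, bookkeeping or reporting software, works from the same information.

```python
# A minimal sketch of tools sharing data through a portable format.
# The record fields and values are hypothetical; the point is that any
# tool that reads JSON (bookkeeping, email, reporting) can reuse them.
import json

invoice = {
    "number": "2024-001",
    "customer": "Example Vendor Co.",
    "items": [{"description": "Consulting", "qty": 3, "unit_price": 120.0}],
}
# Compute the total once and store it with the record,
# so every tool that reads the file agrees on it.
invoice["total"] = sum(i["qty"] * i["unit_price"] for i in invoice["items"])

text = json.dumps(invoice)      # what the invoicing tool writes out
restored = json.loads(text)     # what a bookkeeping tool reads back
print(restored["total"])        # 360.0
```

Choosing an open, text-based format over a proprietary one is exactly the kind of "tools that trade data back and forth" planning the paragraph describes.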