
Artificial Intelligence For Information Technology

November 02, 2021
Joshua Powers

Artificial intelligence (AI) and machine learning (ML) are currently enjoying a robust hype cycle across almost every industry and government function. The global AI market is expected to reach $300 billion within the next five years. Anyone who has used consumer technology in the past five years has experienced AI-driven services while also contributing training data for the next generation of predictive models. A major reason AI can have such impact in its current form is that the power and availability of information technology (IT) are, for the first time in history, adequate to the task of training and deploying rich, accurate AI/ML models. So if capabilities like cloud computing and serverless architecture are enabling AI solutions at scale, what can AI do to enhance the development and delivery of these and other information technology services?

Cloud Computing

The pay-as-you-go cloud model for compute and storage has improved cost efficiency tremendously for most IT shops. However, precisely because they are so flexible, cloud deployments can become overwhelmingly complex, with event-driven architectures, containerization, and serverless operations all creating far more surface area to monitor and evaluate. For most medium-to-large organizations, these complexities are multiplied by at least a factor of three, as multi-cloud operation across AWS, Google Cloud, and Azure becomes a standard posture.

These hyperscale providers all routinely use machine learning to optimize the scaling and delivery of their cloud services to customers. Some AI companies are now beginning to offer tools that use cloud customers’ data to predict which vendors’ products will deliver the required service levels at the lowest cost, and to autonomously configure and migrate workloads to those products. Meanwhile, companies with internal data science teams can mine the log data already resident in the cloud to build predictive usage models that streamline their consumption of cloud resources.
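
As a minimal sketch of what such a predictive usage model might look like, the example below fits a regression model to hourly utilization records exported from a provider’s billing or monitoring logs. The file name and column names here are hypothetical stand-ins; real exports such as cost and usage reports will differ in shape.

```python
# Sketch: forecast next-day compute demand from exported usage logs.
# The CSV and its "timestamp"/"cpu_hours" columns are hypothetical;
# real provider exports will have different schemas.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

usage = pd.read_csv("usage_log.csv", parse_dates=["timestamp"])

# Derive calendar features that typically drive demand cycles.
usage["hour"] = usage["timestamp"].dt.hour
usage["weekday"] = usage["timestamp"].dt.weekday

X = usage[["hour", "weekday"]]
y = usage["cpu_hours"]

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# Predict demand for each hour of an upcoming Monday (weekday=0)
# to right-size reserved capacity versus on-demand instances.
next_day = pd.DataFrame({"hour": range(24), "weekday": [0] * 24})
forecast = model.predict(next_day)
print({h: round(f, 1) for h, f in zip(range(24), forecast)})
```

Even a model this simple can reveal hours where autoscaling floors or reserved capacity sit well above observed demand, which is often where the easiest savings hide.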

Data Center Operations

In 2018, Microsoft sank 864 servers and 27.6 petabytes of storage off the Orkney Islands in Scotland, tethered to power and fiber-optic cables. They operated this truly ‘lights-out’ data center for two years, observing that it outperformed a mirrored land-based data center in reliability and uptime. Positioning IT hardware underwater near wind turbines and fiber trunks will require a different level of autonomous monitoring and self-healing than traditional data centers have implemented thus far. In less radical applications, technologies such as robotic process automation (RPA) are being employed to reduce the repetitive drudgery of many IT tasks and allow human workers to focus on decisions at a higher strategic level.
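
As a hedged illustration of the kind of drudgery such automation absorbs, the sketch below polls a service’s health endpoint and restarts the service when checks fail. The endpoint, unit name, and interval are hypothetical placeholders; a production system would add alerting, backoff, and audit logging rather than restarting blindly.

```python
# Sketch: a minimal self-healing loop for a single service.
# The endpoint and service name are hypothetical placeholders.
import subprocess
import time
import urllib.request

HEALTH_URL = "http://localhost:8080/health"  # assumed health endpoint
SERVICE = "example-app"                      # assumed systemd unit name
CHECK_INTERVAL = 30                          # seconds between probes

def healthy() -> bool:
    """Return True if the service answers its health check."""
    try:
        with urllib.request.urlopen(HEALTH_URL, timeout=5) as resp:
            return resp.status == 200
    except OSError:  # covers connection failures and HTTP errors
        return False

while True:
    if not healthy():
        # Restart the failed unit; real RPA/AIOps tooling would also
        # correlate logs and escalate to a human if restarts repeat.
        subprocess.run(["systemctl", "restart", SERVICE], check=False)
    time.sleep(CHECK_INTERVAL)
```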

Cybersecurity

In 2016, the Defense Advanced Research Projects Agency (DARPA) hosted a Cyber Grand Challenge that pitted seven Cyber Reasoning Systems (CRS) against each other in a battle of simulated cyber attacks and responses. These systems autonomously found vulnerabilities, developed exploits, and created patches in cycles measured in seconds, far faster than any human cyber operators could match. Cyber adversaries, both state actors and private, are already using data science and AI/ML technologies to develop their arsenals. White hat cyber practitioners will have to become conversant with these advanced technologies in order to keep up.
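
One accessible entry point for practitioners is unsupervised anomaly detection over network logs. The sketch below trains an Isolation Forest on a few per-connection features; the file and feature names are hypothetical stand-ins for whatever a real flow collector exports, and a deployed system would need tuning and analyst review.

```python
# Sketch: flag anomalous network connections with an Isolation Forest.
# Feature names are hypothetical; substitute fields from your flow logs.
import pandas as pd
from sklearn.ensemble import IsolationForest

flows = pd.read_csv("netflow.csv")  # assumed export of connection records

features = flows[["bytes_sent", "bytes_received", "duration_sec", "dst_port"]]

# Train on the bulk of traffic; ~1% of connections assumed anomalous.
detector = IsolationForest(contamination=0.01, random_state=0)
flows["anomaly"] = detector.fit_predict(features)  # -1 marks outliers

suspicious = flows[flows["anomaly"] == -1]
print(f"{len(suspicious)} of {len(flows)} connections flagged for review")
```

The appeal of this approach is that it needs no labeled attack data: the model learns what ‘normal’ looks like and surfaces the rest for a human analyst to triage.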

These are just a handful of examples of how AI is changing the work that Information Technology professionals manage.

Looking Forward

In the first 50 years of AI research, advances in information technology continually drove the art of the possible. The first artificial neuron was described in 1943, but the maximum practical size of a neural network at the turn of the century was still measured in thousands of weights between neurons. Today, the GPT-3 natural language model has 175 billion weights, more than the brain of a frog. The narrative has changed: now the potential of information technologies depends on the ability of AI and ML to properly configure, deploy, monitor, and protect them in real time and at scale.

For professionals in the Information Technology field, the opportunity to leverage AI to accelerate and automate mundane tasks and focus on high-value work is growing daily. Tomorrow’s successful technologists will know how to embrace these technologies and enhance their skillsets with them. You can hear more about this intersection of AI and Information Technology at the Northern Virginia Community College IET Speaker Series event on November 4, 2021 at 6:30. Registration is free to the public: ACTIVE | NOVA SySTEMic.

Joshua Powers

Technical Director of AI & Machine Learning

Dev Technology