
Artificial Intelligence A to Z: Building Your Essential AI Vocabulary

  • Writer: Allison Higgins
  • Sep 22
  • 7 min read

“AI will not replace humans, but those who use AI will replace those who don’t.” – Ginni Rometty, Former CEO of IBM



Why It Matters


Demonstrating the ability to apply artificial intelligence to problem solving is quickly becoming an interview "must" for fields in, and adjacent to, technology. Artificial intelligence has impacted every sector of the economy, so familiarizing yourself with these terms, regardless of your background, will add value to your skill set. Understanding and using them correctly in your everyday work can also open doors to higher-paying, essential jobs in the artificial intelligence economy.




Where To Use These Terms


Core Artificial Intelligence Roles

  • Machine Learning Engineer

  • Artificial Intelligence Engineer

  • Robotics Engineer

  • Data Science and Analytics


Artificial Intelligence Adjacent Roles

  • Technical Sales

  • Workflow Development Specialist

  • Compliance Specialist

  • Cybersecurity


Roles Most Likely to be Replaced by Artificial Intelligence

  • Manufacturing

  • Data Entry and Visualization

  • Retail/Commerce

  • Clerical - Accounting, Bookkeeping, and Tax preparation



Where to Learn Artificial Intelligence



Who to Know in AI




A


















B




  • BERT (Bidirectional Encoder Representations from Transformers) - this model digs deep into sentences, picking up on context from both directions, left to right and right to left. BERT learns bidirectional representations of text to significantly improve contextual understanding of unlabeled text across many different tasks.
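To see why reading in both directions matters, here is a toy sketch in plain Python: it predicts a masked word by scoring candidates against the words on *both* sides of the mask, the way BERT's masked-language-model objective does. The mini corpus and lookup tables are made up for illustration; real BERT uses a transformer network, not bigram counts.

```python
from collections import Counter

# Hypothetical mini training corpus.
corpus = [
    "the bank approved the loan",
    "the river bank was muddy",
]

# Count (left_word, word) and (word, right_word) bigrams.
left_of, right_of = Counter(), Counter()
for sent in corpus:
    words = sent.split()
    for i, w in enumerate(words):
        if i > 0:
            left_of[(words[i - 1], w)] += 1
        if i < len(words) - 1:
            right_of[(w, words[i + 1])] += 1

def predict_masked(left, right, candidates):
    """Score each candidate using context on BOTH sides of the mask."""
    return max(candidates,
               key=lambda w: left_of[(left, w)] + right_of[(w, right)])

# "the [MASK] approved ..." -- the right-hand context picks out "bank".
print(predict_masked("the", "approved", ["bank", "river"]))  # -> bank
```

A left-to-right-only model would see just "the" before the mask and could not tell "bank" from "river"; the right-hand word "approved" is what resolves it.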




C








  • Computer Vision - Computer vision is an interdisciplinary field of science and technology that focuses on how computers can gain understanding from images and videos. For AI engineers, computer vision allows them to automate activities that the human visual system typically performs.
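One of the most basic building blocks of computer vision can be shown in a few lines: detecting a vertical edge in a tiny grayscale image by taking horizontal intensity differences. The 4x4 image below is made up; real systems apply learned convolutional filters over far larger images, but the idea is the same.

```python
image = [  # 4x4 grayscale: dark left half (0), bright right half (9)
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]

def horizontal_gradient(img):
    """Difference between each pixel and its right-hand neighbor."""
    return [[row[x + 1] - row[x] for x in range(len(row) - 1)] for row in img]

grad = horizontal_gradient(image)
# The large value marks the edge between columns 1 and 2.
print(grad[0])  # -> [0, 9, 0]
```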





  • Context Window - The maximum number of tokens (words or parts of words) that an AI model can process and consider simultaneously when generating a response. It is essentially the “memory” capacity of the model during an interaction or task. Models with larger context windows can handle larger attachments/prompts/inputs and sustain “memory” of a conversation for longer (Fogarty, 2023).
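Here is a simple sketch of what a context window limit means in practice: once a conversation exceeds the window, the oldest tokens fall out of the model's "memory." The window size is hypothetical, and one word is treated as one token for simplicity (real models split text into subword tokens).

```python
CONTEXT_WINDOW = 8  # hypothetical limit, in tokens

def fit_to_window(conversation, window=CONTEXT_WINDOW):
    """Keep only the most recent tokens that fit in the context window."""
    tokens = conversation.split()
    if len(tokens) <= window:
        return tokens
    return tokens[-window:]  # the oldest tokens are dropped

history = "hello there can you summarize the report I sent you yesterday"
visible = fit_to_window(history)
print(len(visible))  # -> 8
print(visible[0])    # -> you
```

This is why a model with a small context window can "forget" the start of a long conversation, while one with a larger window retains it.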







D


  • Data Science - An interdisciplinary field of technology that uses algorithms and processes to gather and analyze large amounts of data to uncover patterns and insights that inform business decisions.



  • Deep Learning - A machine learning technique that layers algorithms and computing units, or neurons, into what is called an artificial neural network (ANN). Unlike traditional machine learning, deep learning algorithms can improve incorrect outcomes through repetition without human intervention. These deep neural networks take inspiration from the structure of the human brain.







E


  • Emergent Behavior (Emergence) - when an AI system shows unpredictable or unintended capabilities that only occur when individual parts interact as a wider whole.





F




  • Fine-Tuned Model - a pre-trained model that has been adapted, through a technique called fine-tuning, to perform better on a specific task. Instead of training a model from scratch, you start with a model that already understands general patterns and adjust it to work with your data.
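Fine-tuning in miniature: instead of starting from random weights, start from "pre-trained" weights and nudge them toward the new task with a few gradient steps. The one-parameter model (y = w * x), the pre-trained weight, and the task data below are all hypothetical stand-ins for a real network and dataset.

```python
pretrained_w = 2.0                           # weight learned on a broad "general" task
task_data = [(1, 3.0), (2, 6.0), (3, 9.0)]   # new task follows y = 3 * x

def fine_tune(w, data, lr=0.05, epochs=100):
    """Adjust the starting weight with gradient descent on squared error."""
    for _ in range(epochs):
        for x, y in data:
            pred = w * x
            w -= lr * 2 * (pred - y) * x  # gradient of (pred - y)^2 w.r.t. w
    return w

w = fine_tune(pretrained_w, task_data)
print(round(w, 2))  # -> 3.0
```

Because the model starts near a sensible answer, a handful of updates is enough, which is exactly why fine-tuning is so much cheaper than training from scratch.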




G


  • Generative Artificial Intelligence (Generative AI/ Gen AI) - a type of technology that uses AI to create content, including text, video, code and images. A generative AI system is trained using large amounts of data, so that it can find patterns for generating new content.







H








I





L


  • Large Language Models (LLM) - a category of deep learning models trained on immense amounts of data, making them capable of understanding and generating natural language and other types of content to perform a wide range of tasks. LLMs are built on a type of neural network architecture called a transformer which excels at handling sequences of words and capturing patterns in text.




M


  • Machine Learning - a subset of AI in which algorithms mimic human learning while processing data. This field focuses on developing algorithms and models that help machines learn from data and predict trends and behaviors without explicit human instruction.












N


  • Natural Language Processing (NLP) - a type of AI that enables computers to understand spoken and written human language. NLP enables features like text and speech recognition on devices.



  • Neural Network - A neural network is a deep learning technique designed to resemble the structure of the human brain. It requires large data sets to perform calculations and create outputs, which enables features like speech and vision recognition.
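To make the "layers of neurons" idea concrete, here is a minimal forward pass through a two-layer network with hand-picked weights so the arithmetic is easy to follow. Real networks learn their weights from large data sets; this sketch only illustrates the layered structure.

```python
import math

def sigmoid(z):
    """Squash any number into the range (0, 1)."""
    return 1 / (1 + math.exp(-z))

def forward(x1, x2):
    # Hidden layer: two neurons, each a weighted sum passed through sigmoid.
    h1 = sigmoid(1.0 * x1 + 1.0 * x2 - 0.5)
    h2 = sigmoid(-1.0 * x1 - 1.0 * x2 + 1.5)
    # Output layer: combine the hidden activations into one prediction.
    return sigmoid(2.0 * h1 + 2.0 * h2 - 2.0)

print(round(forward(1, 0), 3))  # a value between 0 and 1
```

Each layer feeds its outputs into the next, which is the basic pattern that scales up to the deep networks behind speech and vision recognition.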




O






P


  • Pattern Recognition - Pattern recognition is the method of using computer algorithms to analyze, detect, and label regularities in data. This informs how the data gets classified into different categories.



  • Predictive Analytics - Predictive analytics is a type of analytics that uses technology to predict what will happen in a specific time frame based on historical data and patterns.
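A small sketch of predictive analytics: fit a straight-line trend to historical monthly sales and extrapolate one period ahead. The sales figures are invented; real work would involve much more data, richer models, and validation.

```python
history = [100, 110, 120, 130]  # sales for months 1-4 (illustrative)

def linear_forecast(ys):
    """Fit a least-squares line to the history and predict the next period."""
    n = len(ys)
    xs = range(1, n + 1)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope * (n + 1) + intercept

print(linear_forecast(history))  # -> 140.0
```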









Q


  • Quantum Computing - a computing paradigm built on quantum bits, or qubits, which can store both zeros and ones. Qubits can represent any combination of both zero and one simultaneously; this is called superposition, and it is a basic feature of any quantum state. When a qubit’s subatomic particles are in a superposition state, each subatomic particle can interact with and influence others, a phenomenon called quantum interference. Quantum chips make up the physical hardware that stores qubits, similar to microchips in classical computers.
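Superposition can be simulated with just two numbers: a qubit's state is a pair of amplitudes, and measurement probabilities are the squared magnitudes of those amplitudes. The sketch below applies a Hadamard gate (a standard quantum gate) to a qubit starting in |0>, producing an equal superposition of 0 and 1.

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a qubit's (|0>, |1>) amplitude pair."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = hadamard((1.0, 0.0))         # start in |0>
probs = [amp ** 2 for amp in state]  # Born rule: probability = |amplitude|^2
print([round(p, 2) for p in probs])  # -> [0.5, 0.5]
```

After the gate, measuring the qubit gives 0 or 1 with equal probability, which is the superposition the definition above describes.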




R







S





  • Supervised Learning - a type of machine learning that learns from labeled historical input and output data. It’s “supervised” because you are feeding it labeled information.
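A tiny supervised-learning example: a one-nearest-neighbor classifier trained on labeled data. The labels are what make it "supervised" -- every training input comes with a known answer. The hours-studied data is invented for illustration.

```python
labeled_data = [  # (hours studied, outcome label) -- illustrative data
    (1.0, "fail"), (2.0, "fail"), (6.0, "pass"), (8.0, "pass"),
]

def predict(hours):
    """Label a new input with the label of its closest training example."""
    nearest = min(labeled_data, key=lambda example: abs(example[0] - hours))
    return nearest[1]

print(predict(7.0))  # -> pass
print(predict(1.5))  # -> fail
```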





T





  • Turing Test - The Turing test was created by computer scientist Alan Turing to evaluate a machine’s ability to exhibit intelligence equal to humans, especially in language and behavior. When facilitating the test, a human evaluator judges conversations between a human and machine. If the evaluator cannot distinguish between responses, then the machine passes the Turing test.






U


  • Unsupervised Learning - Unsupervised learning is a machine learning type that looks for data patterns. Unlike supervised learning, unsupervised learning doesn’t learn from labeled data. This type of machine learning is often used to develop predictive models and to create clusters.
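For contrast with the supervised example, here is unsupervised learning in miniature: a few rounds of two-cluster k-means on unlabeled numbers. No labels are given; the grouping emerges from the data alone. The values and starting centers are illustrative.

```python
data = [1.0, 1.5, 2.0, 10.0, 10.5, 11.0]  # unlabeled points

def two_means(points, rounds=10):
    """Cluster points into two groups by repeatedly recomputing centers."""
    c1, c2 = points[0], points[-1]  # initialize centers at the extremes
    for _ in range(rounds):
        g1 = [p for p in points if abs(p - c1) <= abs(p - c2)]
        g2 = [p for p in points if abs(p - c1) > abs(p - c2)]
        c1, c2 = sum(g1) / len(g1), sum(g2) / len(g2)
    return sorted(g1), sorted(g2)

low, high = two_means(data)
print(low)   # -> [1.0, 1.5, 2.0]
print(high)  # -> [10.0, 10.5, 11.0]
```

The algorithm was never told which points belong together; it discovered the two clusters itself, which is the essence of unsupervised learning.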




V






Z





© 2025 by Kolor Koded Enterprises, LLC.