Knowledge

Notitia's A - Z of technical concepts

If you've arrived here because you need the layman's guide to "techy" terminology, bookmark this now! We'll keep adding to this list to make it the most comprehensive one around.

February 7, 2023

Agile

An iterative approach to project management and software development that helps teams deliver value to customers faster. Agile teams deliver work in small (but consumable) increments and respond to change quickly.

Alerts and monitoring systems (for data workflows)

Alerts and monitoring systems for data workflows are tools that provide real-time tracking and notifications about the performance and health of data processes. They help detect anomalies, errors, or deviations in the workflow, allowing prompt identification and resolution of issues to maintain the reliability and integrity of the data pipeline.
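
As a small illustration, here is a minimal sketch in Python of a check that raises an alert when a pipeline hasn't completed within its expected window. The `check_pipeline` function, the 24-hour limit, and the printed alert are all illustrative assumptions; a real system would notify a person or a chat channel.

```python
import datetime

# Minimal monitoring sketch: alert if the pipeline hasn't succeeded recently.
# The 24-hour freshness limit is an illustrative assumption.
FRESHNESS_LIMIT = datetime.timedelta(hours=24)

def check_pipeline(last_successful_run: datetime.datetime) -> None:
    age = datetime.datetime.now() - last_successful_run
    if age > FRESHNESS_LIMIT:
        # A real system would page an engineer or post to a chat channel;
        # a print statement stands in for the notification here.
        print(f"ALERT: pipeline last succeeded {age} ago")
    else:
        print("Pipeline healthy")

# Example: a pipeline that last succeeded 30 hours ago triggers the alert.
check_pipeline(datetime.datetime.now() - datetime.timedelta(hours=30))
```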

Artefacts

Artefacts are pieces of information that describe the product being developed, the actions required to produce it, and the actions performed during the project. In software development, an artefact is any key piece of information needed during a project's development.

BI Analytics

Business Intelligence (BI) Analytics refers to the process of using various tools, techniques, and technologies to analyse and interpret data in order to make informed business decisions. It involves collecting, integrating, and analysing large volumes of data (from multiple sources) to uncover insights that can help drive business performance and competitiveness.

BI Analytics involves the use of various data visualisation tools, dashboards, and reports to present data in a way that is easy to understand and act on. The insights gained from BI Analytics can be used to improve operational efficiency, identify new business opportunities, optimise customer experience, and gain a competitive edge in the market.

Overall, BI Analytics provides businesses with a comprehensive view of their operations, enabling them to make data-driven decisions that can help them achieve their business objectives.

Big Data

Big data is a term used to describe extremely large datasets that are too complex and voluminous for traditional data processing methods to handle. It refers to the massive volume of structured, semi-structured, and unstructured data that is generated by various sources such as social media, sensors, web applications, and other digital platforms.

The concept of big data is often associated with the three Vs: Volume, Velocity, and Variety. Volume refers to the enormous amount of data generated every day, while velocity refers to the speed at which this data is generated and processed. Variety describes the diverse types of data that can be generated, such as text, images, videos, and audio.

Big data technologies, such as Hadoop, Spark, and NoSQL databases, have emerged to help organisations to manage, store, and analyse this data, and to extract valuable insights from it.

The insights derived from big data can be used to improve business decision-making, optimise processes, and identify new opportunities for growth.

Core Software Stack

A core software stack refers to the foundational set of essential technologies, frameworks, and programming languages that form the basis of a software application or system. It typically includes components such as the operating system, database management system, programming language, and other fundamental tools required for developing and running the software.

Dashboard

A visual display of your data. It pulls together a comprehensive overview of data from different sources. Dashboards are useful for monitoring, measuring, and analysing relevant data in key areas.

Data

Information, especially facts or numbers, collected to be examined, considered, and used to support decision-making; or information in an electronic form that can be stored and used.

Data Analytics

Data analytics is the management and analysis of data to drive business processes and improve business outcomes through more effective decision making and enhanced customer experiences.

Database

A database (or electronic database) is any collection of data that is organised for rapid search and retrieval. Databases are structured to facilitate the storage, retrieval, modification, and deletion of data in conjunction with various data-processing operations. A database management system (DBMS) extracts information from the database in response to queries.
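
To make this concrete, here is a minimal sketch using Python's built-in sqlite3 module; the `customers` table and its rows are made up for the example.

```python
import sqlite3

# Create a small in-memory database, store a few rows, then let the DBMS
# answer a question via a query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "Asha", "Melbourne"), (2, "Ben", "Sydney"), (3, "Caro", "Melbourne")],
)

# The query retrieves only the customers based in Melbourne.
for (name,) in conn.execute(
    "SELECT name FROM customers WHERE city = ?", ("Melbourne",)
):
    print(name)
```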

Data Dictionary

A data dictionary provides detailed information about the structure, contents, and relationships between the data elements in a database or other data repository. It typically includes a list of all the data elements or fields in the database, along with their data types, descriptions, and any constraints or business rules that apply to them.

It may also include information about the source of the data, the data format, and any transformations or calculations that have been applied to the data. It can be used as a reference guide for database administrators, programmers, analysts, and other stakeholders who need to understand and work with the data in the repository.

In short, a data dictionary serves as a metadata repository that helps to ensure consistency and accuracy in data management.
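
As a toy example, here is what a deliberately tiny data dictionary for a hypothetical `customers` table might look like if kept as plain Python data; in practice it usually lives in documentation tools or metadata platforms, but the shape of the information is the same.

```python
# A toy data dictionary: field names mapped to their type, meaning, and rules.
data_dictionary = {
    "customer_id": {
        "type": "integer",
        "description": "Unique identifier for the customer",
        "constraints": "primary key, not null",
    },
    "email": {
        "type": "text",
        "description": "Customer's contact email address",
        "constraints": "unique, must contain '@'",
    },
    "signup_date": {
        "type": "date",
        "description": "Date the account was created",
        "source": "CRM export, loaded nightly",
    },
}

# Print a quick reference listing for analysts and developers.
for field, meta in data_dictionary.items():
    print(f"{field}: {meta['type']} - {meta['description']}")
```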

Data Evaluation Script

A data evaluation script is a set of instructions written in a programming language that analyses and assesses data against predefined criteria or algorithms. It typically processes and interprets data to derive meaningful insights, identify patterns, or make informed decisions within a given context.
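
A minimal sketch of the idea, assuming made-up records and validation criteria:

```python
# Evaluate each record against predefined criteria and report any problems.
records = [
    {"id": 1, "age": 34, "email": "a@example.com"},
    {"id": 2, "age": -5, "email": None},
]

def evaluate(record: dict) -> list[str]:
    problems = []
    if record["email"] is None:
        problems.append("missing email")
    if not 0 <= record["age"] <= 120:
        problems.append("age out of range")
    return problems

for record in records:
    issues = evaluate(record)
    print(f"record {record['id']}: {', '.join(issues) or 'OK'}")
```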

Data Governance

Data governance is the set of processes, roles, policies, standards, and metrics that ensures the effective and efficient use of data. It establishes the processes and responsibilities that ensure the quality and security of the data used across a business. Data governance defines who can take what action, upon what data, in what situations, using what methods.

Data Lake

A data lake is a centralised repository designed to store, process, and secure large amounts of structured, semi-structured, and unstructured data. It can store data in its native format and process data of any variety, regardless of size.

Data Pipelines

Data pipelines are a set of processes that extract, transform, and load (ETL) data from various sources to a destination, ensuring a smooth and organised flow of information. They enable efficient data management, analysis, and storage, facilitating the integration of disparate data into a unified and usable format.
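
Here is a minimal extract-transform-load (ETL) sketch in Python; the file names are illustrative, and real pipelines typically read from databases or APIs and run under an orchestrator or scheduler.

```python
import csv
import json

def extract(path: str) -> list[dict]:
    # Extract: read raw rows from a source (a CSV file in this sketch).
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[dict]:
    # Transform: normalise one field so downstream consumers see one format.
    return [{**row, "email": row["email"].strip().lower()} for row in rows]

def load(rows: list[dict], path: str) -> None:
    # Load: write the cleaned rows to the destination (a JSON file here).
    with open(path, "w") as f:
        json.dump(rows, f, indent=2)

load(transform(extract("customers.csv")), "customers_clean.json")
```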

Data Pipeline Debugging

Data pipeline debugging involves identifying, analysing, and resolving issues or errors within the data pipeline infrastructure to ensure the smooth and accurate flow of data from source to destination. It often includes tracking and troubleshooting data inconsistencies, errors in transformation processes, and addressing any bottlenecks or disruptions that may impact the reliability and efficiency of the data pipeline.
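
One common debugging tactic, sketched below with made-up stages: log the row count after every step, so a sudden drop (or spike) in volume points straight at the stage that broke.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("pipeline")

def debug_stage(name: str, rows: list) -> list:
    # Record how many rows survived this stage before passing them on.
    log.info("after %s: %d rows", name, len(rows))
    return rows

rows = debug_stage("extract", [{"id": 1}, {"id": 2}, {"id": None}])
rows = debug_stage("drop_null_ids", [r for r in rows if r["id"] is not None])
```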

Data Point

A data point is a single unit of information or observation within a dataset, representing a specific value or measurement related to a particular variable. In statistical analysis, data points serve as the building blocks for generating insights, trends, and patterns by collectively forming the dataset.

Data Quality

Data quality is the degree to which data meets a company's expectations of accuracy, validity, completeness, and consistency.
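
As a small worked example, one of those dimensions (completeness) can be measured as the share of required fields that are actually populated; the records below are made up.

```python
records = [
    {"name": "Asha", "email": "a@example.com"},
    {"name": "Ben", "email": None},
]
required = ["name", "email"]

# Count populated required fields across all records.
filled = sum(1 for r in records for f in required if r.get(f) is not None)
completeness = filled / (len(records) * len(required))
print(f"completeness: {completeness:.0%}")  # 3 of 4 fields filled -> 75%
```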

Data Source

A data source is a location or system from which data is collected or retrieved. It can be a database, file, sensor, application, or any platform that generates or stores information, serving as the origin point for data acquisition and analysis.

Data Strategy

A data strategy is a long-term plan that defines the technology, processes, people, and rules required to manage an organisation's information assets.

Data Visualisation

Data visualisation is the process of representing data and information in a graphical or pictorial format. It involves creating visual representations of data to help people understand complex information and to make it easier to identify patterns, trends, and relationships.

Data visualisation tools can be used to create charts, graphs, maps, and other visual representations of data. These tools allow people to explore data and to see how different data points relate to each other. By visualising data, people can better understand complex information and make more informed decisions.

Data visualisation is used in a variety of industries, including business, finance, healthcare, and science. It is a key component of data analytics and is an important tool for communicating insights and findings to stakeholders.
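
As a quick illustration, here is a minimal chart built with matplotlib, a widely used Python plotting library; the monthly sales figures are made up.

```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
sales = [120, 135, 128, 160, 172, 190]

# A simple line chart makes the upward trend obvious at a glance.
plt.plot(months, sales, marker="o")
plt.title("Monthly sales")
plt.xlabel("Month")
plt.ylabel("Units sold")
plt.show()
```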

Data Warehouse

A centralised repository that collects, integrates, and stores large volumes of structured data from various sources within an organisation.

Data Workflow

A data workflow refers to the end-to-end sequence of steps and processes involved in handling, processing, and analysing data from its initial collection or acquisition to its final output or decision-making. It encompasses tasks like data ingestion, transformation, analysis, and visualisation, providing a structured framework for managing and extracting insights from data throughout its lifecycle.

Developer (or Web Developer)

Developers, web developers, and software engineers focus on developing applications, features, and functionality for end-users.

Emailing or Messaging API Service

An emailing or messaging API service is a software interface that allows developers to integrate email or messaging functionality into their applications, websites, or systems. It enables automated sending, receiving, and management of emails or messages, providing a seamless communication layer for applications without the need for developers to build the entire messaging infrastructure from scratch.
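
For flavour, here is a minimal sketch of sending a message from code using Python's standard library; the server address, credentials, and email addresses are placeholders. Hosted messaging API services wrap the same idea behind an HTTP interface, so you don't have to run the mail infrastructure yourself.

```python
import smtplib
from email.message import EmailMessage

# Compose the message.
msg = EmailMessage()
msg["From"] = "alerts@example.com"
msg["To"] = "team@example.com"
msg["Subject"] = "Nightly pipeline finished"
msg.set_content("All stages completed without errors.")

# Hand it to an SMTP server (placeholder host and credentials).
with smtplib.SMTP("smtp.example.com", 587) as server:
    server.starttls()
    server.login("alerts@example.com", "app-password")
    server.send_message(msg)
```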

Fact Table

A fact table is a central table in a data warehouse that stores quantitative data about a business process or activity. It typically contains measurements, metrics, and keys that connect it to dimension tables for comprehensive analysis in data analytics.
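
A minimal star-schema sketch, with made-up table and column names: the `fact_sales` table holds the measures, and its keys point at the `dim_date` and `dim_product` dimension tables.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, calendar_date TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, product_name TEXT);

-- The fact table: one row per sale, measures plus keys to the dimensions.
CREATE TABLE fact_sales (
    date_id    INTEGER REFERENCES dim_date (date_id),
    product_id INTEGER REFERENCES dim_product (product_id),
    quantity   INTEGER,  -- a measure
    revenue    REAL      -- another measure
);
""")
```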

Feature Driven Development (FDD)

Methodology within an agile framework that organises software development around making progress on features (user stories).

Minimum Viable Product

A version of a product with just enough features to be usable by early customers, who can then provide feedback for future development. A focus on releasing an MVP means that developers avoid unnecessary work. Instead, they iterate on working versions and respond to feedback, challenging and validating assumptions about a product’s requirements.

Mood board

A mood board is a visual arrangement of images, materials, text, and other design elements to portray the final design's style. Mood boards can be used for creating brand designs, product designs & other design projects.

Modern Cloud Technologies

Modern cloud technologies refer to a set of tools, platforms, and infrastructure that enable the delivery of on-demand computing resources over the internet. These technologies are designed to provide businesses and organisations with scalable and flexible IT infrastructure, as well as the ability to rapidly develop, deploy, and manage applications and services.

Some examples of modern cloud technologies include:

  1. Infrastructure as a Service (IaaS): This technology provides virtualised computing resources, such as virtual machines, storage, and networking, over the internet.
  2. Platform as a Service (PaaS): This technology provides a platform for developers to build, test, and deploy applications without having to worry about the underlying infrastructure.
  3. Software as a Service (SaaS): This technology provides access to software applications over the internet, typically through a web browser.
  4. Serverless computing: This technology allows developers to write and run code without having to manage the underlying infrastructure. The cloud provider automatically scales the computing resources up and down based on the application's needs (see the sketch after this list).
  5. Containers and container orchestration: This technology provides a lightweight, portable way to package and deploy applications. Container orchestration tools, such as Kubernetes, help manage and scale containerised applications.
  6. Hybrid cloud: This technology allows businesses to use a combination of public cloud and private cloud infrastructure to achieve greater flexibility, scalability, and cost savings.
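
To make the serverless idea (item 4) concrete, here is a minimal function in the style of an AWS Lambda handler; the event shape and handler signature details vary by cloud provider, so treat this purely as a sketch.

```python
# The cloud provider invokes this function on demand and scales it
# automatically; no server is provisioned or managed by the developer.
def handler(event, context):
    # `event` carries the request payload; its shape is assumed here.
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}
```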

Overall, modern cloud technologies enable businesses and organisations to build and deploy applications and services more quickly and efficiently, while also reducing the need for costly on-premises infrastructure.

Scheduled Log

A scheduled log is a record of events, activities, or system information that is systematically captured and stored at predetermined intervals or specific times. This structured logging approach allows for the organised tracking of changes, performance metrics, or other relevant data on a regular and scheduled basis.
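
A minimal sketch of the idea using Python's standard library; the one-minute interval and the fake queue-depth reading are illustrative, and production systems usually delegate the scheduling to cron or an orchestrator.

```python
import logging
import time

logging.basicConfig(
    filename="metrics.log",
    level=logging.INFO,
    format="%(asctime)s %(message)s",
)

def read_queue_depth() -> int:
    return 42  # placeholder for a real measurement

# Capture one log entry per minute, on schedule.
while True:
    logging.info("queue_depth=%d", read_queue_depth())
    time.sleep(60)
```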

Transformed Data

Transformed data refers to data that has been altered or modified from its original form to make it more useful or informative for analysis or modelling. Data transformation involves applying a set of mathematical or statistical operations to the data, which can include scaling, normalisation, standardisation, encoding, imputation, aggregation, or feature extraction.

Transformed data can be useful in many different contexts, such as data preprocessing for machine learning, exploratory data analysis, data visualisation, or data cleaning. By transforming data, it is often possible to reveal patterns, relationships, or trends that may not be apparent in the raw data, and to make the data more amenable to statistical analysis or modelling.
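
As a tiny worked example, here is one common transformation, min-max normalisation, which rescales values into the range 0 to 1; the input numbers are made up.

```python
raw = [12.0, 45.0, 7.0, 88.0]

# Rescale each value: 0 maps to the minimum, 1 maps to the maximum.
lo, hi = min(raw), max(raw)
normalised = [(x - lo) / (hi - lo) for x in raw]
print(normalised)  # [0.0617..., 0.4691..., 0.0, 1.0]
```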

UI/UX Design

In the context of data analytics, digital and graphic designers work with the client to ensure their dashboard is visually appealing and in line with the business's brand, while also being simple to use & functional. They need to understand the requirements of the developer, the data analyst, and the client.

UI/UX design refers to the process of designing digital products such as websites, mobile applications, software programs, and other user interfaces with a focus on user experience (UX) and user interface (UI) design.

UX design is concerned with creating a positive and seamless experience for the user by understanding their needs and behaviours and designing the product accordingly. This involves conducting user research, creating user personas, wireframing, prototyping, and testing to ensure that the final product is usable and effective for the user.

UI design, on the other hand, focuses on the visual and interactive aspects of the product, such as the layout, typography, colour scheme, and graphic design. The aim is to create an aesthetically pleasing and intuitive interface that allows the user to interact with the product in a simple and effective manner.

Overall, UI/UX design aims to create a product that is not only visually appealing but also user-friendly and meets the needs of the target audience.

Meet our in-house design team.

Vendors (Analytics Software Vendors)

Data analytics software vendors are companies that develop and sell software solutions designed to help organisations analyse large volumes of data to extract insights and make data-driven decisions.

These software solutions typically include features such as data visualisation, data mining, predictive analytics, and machine learning algorithms that allow users to identify patterns and trends within their data and gain valuable insights into their business operations.

There are many different types of data analytics software vendors, ranging from large enterprise software companies to smaller niche vendors that specialise in specific industries or applications.

Notitia's list of partners.

Wireframe

A wireframe is the visual representation of a website or dashboard which shows where the elements are positioned & how they interact with each other. It allows the user to understand the visitor’s journey to achieve certain actions.

Notitia's Data Quality Cake recipe