Embedded Analytics
Glossary

Definitions from A – Z

A

Actionable Analytics

Actionable analytics are processed data sets that point to direct, meaningful actions you can take based on the analysis of raw data.

For example, let’s say you’re looking through survey responses and see a large drop-off point after the fifth question. The actionable insight would be to change the design of the survey or rewrite that question.

It wasn’t a hunch or something anecdotal that led you to that choice. You came to it after reviewing the information at hand, with hard numbers you could look at. On the other side of the coin is data that you already know, or that you can’t find any sort of pattern in.

Actionable Reporting >

Active Directory

Active Directory (AD) is a database and set of services connecting users with network resources. It organizes a company’s hierarchy and records which users and computers exist and who is allowed to do what. A database of user accounts, job titles, phone numbers, passwords, and permissions is a typical example of what Active Directory stores.

Active Directory services control much of what goes on in your IT environment: they authenticate each person by checking the user ID and password, and they allow access only to the data that user is permitted to use. This simplifies life for end users and administrators while elevating security for organizations.

Ad Hoc Reporting

Ad hoc is Latin for “for this purpose,” commonly used to mean “as needed.” That means ad hoc reports are generated as needed to answer business questions. An ad hoc report is typically a one-time-use report, and ad hoc reporting is a model of business intelligence (BI) in which reports are built by nontechnical users.

Ad hoc reporting differs from structured reporting in many ways. Structured reports use a large volume of data and are produced from a formalized reporting template. Ad hoc reports rely on smaller amounts of data, making it easier to report on a specific data point and answer a specific business question.

API

API, or Application Programming Interface, is a software interface that connects computers or programs. In other words, an API allows two applications to talk to each other. Each time you use Google Maps or check the weather on your phone, you are using an API.
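
For illustration, here is a minimal Python sketch of one application talking to another through an API, assuming the third-party requests package; the endpoint URL and response fields are hypothetical:

  import requests

  # Ask a (hypothetical) weather service for current conditions.
  response = requests.get(
      "https://api.example.com/weather",
      params={"city": "Los Angeles"},
  )
  response.raise_for_status()        # fail loudly on HTTP errors
  data = response.json()             # parse the JSON body
  print(data["temperature"], data["conditions"])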

Artificial Intelligence (AI)

Artificial intelligence is intelligence demonstrated by machines rather than the natural intelligence displayed by humans or animals. AI simulates human intelligence processes using computer systems.

Artificial intelligence (AI) applications include expert systems, natural language processing (NLP), facial recognition, and machine vision, which accomplish specific tasks by processing large amounts of data and recognizing patterns in it.

Artificial intelligence makes it possible for machines to learn from sample data or experience, adjust to new data, and perform tasks like humans. Self-driving cars, which rely heavily on deep learning and NLP, are one example of AI in use.

Automated Alerts

An automated business alert is an automated message, sent by email, text, or another channel when an error condition or event occurs, that calls for action. For example, a bakery manager can be automatically informed when the stock of pastry flour falls below or rises above a certain level.

Custom alerts can be sent to the right users at the right time for appropriate action. In our bakery example, the manager receives an alert when flour is critically low and more must be purchased; if traffic and sales increase, a notification can prompt the manager to add more staff.
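
A minimal sketch of the bakery alert in Python; the threshold, addresses, and SMTP host are placeholders:

  import smtplib
  from email.message import EmailMessage

  FLOUR_MINIMUM_KG = 50

  def check_flour_alert(current_kg: float) -> None:
      """Send an alert email only when stock crosses the threshold."""
      if current_kg >= FLOUR_MINIMUM_KG:
          return                                  # nothing to do
      msg = EmailMessage()
      msg["Subject"] = f"Low stock: pastry flour at {current_kg} kg"
      msg["From"] = "alerts@bakery.example"
      msg["To"] = "manager@bakery.example"
      msg.set_content("Stock is below the reorder threshold. Purchase more.")
      with smtplib.SMTP("smtp.bakery.example") as server:   # placeholder host
          server.send_message(msg)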

Autoscaling

Autoscaling (also auto-scaling, auto scaling, or automatic scaling) is a cloud computing method that dynamically and automatically adjusts the computational resources in a server farm, typically measured by the number of active servers, based on the current load.

Autoscaling is useful when a site or application needs additional server resources to satisfy additional page requests or processing jobs. Autoscaling is typically used to handle sudden traffic bursts or spikes but is also useful when designing a scalable architecture to manage longer term resource growth.
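
To make the idea concrete, here is a toy scaling policy in Python (not any particular cloud provider’s API); the per-server capacity figure is made up:

  import math

  def desired_servers(requests_per_sec: float,
                      capacity_per_server: float = 100.0,
                      minimum: int = 2,
                      maximum: int = 50) -> int:
      """Size the server pool from the measured load."""
      needed = math.ceil(requests_per_sec / capacity_per_server)
      return max(minimum, min(maximum, needed))

  print(desired_servers(40))     # quiet traffic -> stays at the floor of 2
  print(desired_servers(1250))   # traffic burst -> 13 servers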

AWS Native for Cloud-Native Development

In the broader context of cloud-native development, “AWS Native” can refer to the overall approach of building and deploying applications on AWS that leverage the benefits of the cloud platform. This involves using managed services, serverless functions, and other cloud-native features to create flexible, scalable, and resilient applications.

Here are some key principles of AWS Native for cloud-native development:

  • Focus on managed services: Utilize pre-built and maintained services by AWS instead of managing your own infrastructure.
  • Adopt serverless architectures: Leverage serverless functions for event-driven processing and avoid managing servers directly (see the sketch after this list).
  • Embrace microservices: Break down applications into smaller, independent services for increased agility and scalability.
  • Use automation tools: Automate infrastructure provisioning, deployment, and scaling for faster delivery and reduced operational overhead.
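
As a minimal sketch of the serverless principle above, here is an event-driven function in the shape of an AWS Lambda handler; the event fields and the API Gateway-style response are illustrative assumptions:

  import json

  def handler(event, context):
      """Runs on demand when an event arrives; there is no server to manage."""
      order_id = event.get("order_id", "unknown")
      return {
          "statusCode": 200,
          "body": json.dumps({"processed": order_id}),
      }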

B

Big Data

Data is considered ‘big’ when it exceeds your operational database capacity. The four main characteristics of big data are:

  • Volume, or size of data (e.g., stock trading data for a year)
  • Velocity, or how quickly data arrives (worldwide customer purchasing data over time)
  • Variety, or forms the data takes (e.g., user account info, product specifications, etc.)
  • Veracity, or the uncertainty of data (e.g., reviews from real people vs. bots).

The four V’s help you determine when to migrate from regular data solutions to big data solutions.

Business Intelligence Software >

C

Cloud Native

Cloud-native is an approach to software development that builds and runs scalable applications in public or private clouds such as AWS and Azure. A cloud-native stack consists of continuous integration, orchestrators, and container engines. These cloud technologies provide accessibility and scalability and let developers deliver new services more quickly and easily. At the end of the day, it’s about how applications are created and deployed.

Compliance Analytics

Compliance analytics is the analysis of company data (and sometimes data the company does not hold) using algorithms that search for patterns and anomalies to uncover policy violations, misconduct, and fraud.

Organizations use sophisticated analytics to head off noncompliance and fraud to avoid damages and the attention of regulators.

D

Dark Data

Dark data is the data you’re storing but not analyzing. Typically this data is held for compliance reasons or under security policies, and it sits around doing nothing but taking up space. It’s not data that fits easily into relational databases (e.g., MySQL), so traditional BI vendors cannot help you do anything with it. So it sits there, collecting metaphorical dust. Meanwhile, you may be sitting on missed opportunities to save costs, find efficiencies, or provide better customer service.

Every company strives to make the best decisions possible and having all of the relevant data is a crucial element to achieve that goal. As our world moves increasingly to the cloud and becomes vastly more interconnected, the volume, variety and velocity of the data being generated creates a growing mountain of dark data that cannot be analyzed by traditional means.

Why is Analyzing Dark Data Important?

If your company is making decisions based on partial, incomplete or inaccurate data, you’re putting your entire organization at risk. You need an analytics platform that was architected for today’s decision-making requirements. One that can ingest all types of data no matter what its structure or where it lives.

More than 70% of all business data is never used for analysis because most traditional analytics tools only work with structured data.

One that has the power and performance to give you the answers you need, when you need them. And one that can harness the power of AWS to take action and bring the insights to you using the latest in automation and machine learning. Including all of your data into your decision making is no longer optional, it’s essential.

With traditional analytics, nearly 80% of the time spent on projects is spent on labor-intensive (and often soul-crushing) data preparation. But with Qrvey, data prep and transformation can be streamlined and automated so you can focus on insights, not input.

The world has moved beyond structured data, isn’t it time your analytics moved beyond just visualizations?

Learn more about Qrvey’s data management capabilities for SaaS applications

Database >

Dashboards

A dashboard presents multiple data sources in a single interface, giving access to all your most important data and metrics at a glance. Dashboards allow a unified view of the data and information that matters to a business. Common business dashboard elements include charts and graphs, key performance indicators (KPIs), GIS maps, news, RSS feeds, and stock quotes.

Data Automation

Data automation is the process of updating data programmatically (Extract, Transform, and Load [ETL]), rather than manually. Data Automation uses intelligent processes and systems to collect, process or store large quantities of data. Learn more about Qrvey’s embedded automation features.

Data Collection

Data collection is any process that captures any type of data. The goal is to capture data for analysis that leads to answers to the questions posed.

Data Connectors

A data connector moves data from one database to another. Data connectors may also filter and transform data for query and analysis.

Data Discovery

Data discovery, a business intelligence term, is the process of collecting data and consolidating it to evaluate and search for items or patterns. Data discovery helps filter the noise out of data to uncover insights. Those insights are delivered to employees to help them make informed decisions in their day-to-day work.

Data Enrichment

Data Enrichment is the process or method of combining and preparing data from internal and external sources for analysis and exploitation. Data enrichment tools blend the data and present it in the right format for analysis allowing users to get answers quickly without IT assistance.

Data Extraction

Data extraction is the process of retrieving data from a database or SaaS platform for online analytical processing (OLAP) or data storage. Data extraction is the first step in the process known as ETL — extract, transform, and load.

Data Governance

Data governance is a set of processes, roles, policies, standards, and metrics for the efficient, effective use of information. It outlines policies on access to data: who can take action, the methods used, and when and in what situations. Data governance enables organizations to control the processes that handle data. Items covered by data governance include availability, usability, accuracy, integrity, consistency, completeness, and security.

Data Ingestion

Data ingestion is the process of moving data from a variety of sources to a data warehouse, data mart, database, or a document store to be accessed, used, and analyzed.

Data Lake >

Data Mart >

Data Management

Data management is the practice of securely collecting, storing, and using data in an efficient and cost-effective manner, using a wide range of tasks, policies, and procedures.

Data Mart

A data mart is a structure or access layer on top of a data warehouse used to retrieve client-facing data for a defined business subgroup, such as a line of business or team. Data marts make data available to these groups, letting them access critical insights without searching the full data warehouse. The objective is to give users access to relevant data in the shortest amount of time.

Data Mining

Data mining is the extraction and discovery of knowledge within large data sets: uncovering anomalies, patterns, and correlations using a broad array of machine learning techniques, statistics, and database systems to predict outcomes.

Five major elements of data mining are:

  1. Extract, transform and load
  2. Store and manage
  3. Access to analysts
  4. Analysis
  5. Visualization (graph, table, etc.)

Data Modeling

Data modeling is the process of preparing data for analysis by creating data models. Modern business intelligence applications can automate the traditionally long and inefficient work of preparing, modeling, and summarizing data for analysis.

Data Types (structured vs unstructured vs semi-structured) >

Data Pipeline

A data pipeline is a group of data processing components connected in series, where the output of one component acts as the input of the next. An example of a data pipeline is automating the extracting, transforming, combining, validating, and loading of data for analysis and visualization. Processes in the pipeline usually run in parallel to reduce bottlenecks and latency while increasing data speed.
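
A minimal Python sketch of components connected in series; the stages and record fields are made up:

  def extract():
      yield {"city": "LA", "amount": "100"}     # pretend source rows
      yield {"city": "NY", "amount": "250"}

  def transform(rows):
      for row in rows:
          row["amount"] = float(row["amount"])  # cast for analysis
          yield row

  def load(rows):
      for row in rows:
          print("loading", row)                 # stand-in for a DB write

  load(transform(extract()))                    # output of one feeds the next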

Data Profiling

Data profiling is the process of reviewing data and collecting statistical summaries to understand its structure, content, and interrelationships, and to identify its potential for data projects. Data profiling prepares data for visual analysis.
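
For instance, a first profiling pass might collect summaries with pandas (one common tool among many); the sample data is made up:

  import pandas as pd

  df = pd.DataFrame({
      "age": [34, 45, 29, None, 52],
      "plan": ["pro", "basic", "pro", "pro", "basic"],
  })

  print(df.describe(include="all"))   # statistical summaries per column
  print(df.isna().sum())              # missing values, column by column
  print(df["plan"].value_counts())    # content distribution of a field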

Data Security

Data security means protecting digital data against breaches by outside actors, unauthorized users and cyberattacks. Security applications used to protect data include data encryption, hashing, tokenization, and management practices.

Data Tables

A data table is a series of intersecting cells laid out in columns and rows, with a header row describing the data in each column. Also known as a tabular report, data tables are used mainly to record information. A common type of data table is a cross-tab report, where one column (usually the left one) organizes or groups the data, making it more intelligible and useful.
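
A minimal cross-tab sketch with pandas, where the left column groups the data; the figures are made up:

  import pandas as pd

  sales = pd.DataFrame({
      "region":  ["East", "East", "West", "West"],
      "quarter": ["Q1", "Q2", "Q1", "Q2"],
      "revenue": [120, 135, 90, 110],
  })

  # Rows grouped by region, one column per quarter.
  print(sales.pivot_table(index="region", columns="quarter",
                          values="revenue", aggfunc="sum"))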

Data Transformation

Data transformation converts data from one format into another and is key to tasks such as data wrangling, data warehousing, data integration and application integration. In a data warehouse environment, it is the T in ETL (extract, transform, load).

Data Visualization > 

Data Warehouse

A data warehouse is an archive of historical information: a central repository of integrated data that a business or other organization adds to over time. Data warehouses are fed by systems that extract data from operational systems and integrate data from different sources. Benefits of data warehouses include the opportunity to improve data quality and to shape data structures so business users can understand them easily.

Decentralized BI

Decentralized BI stores and manages data and business intelligence independently by department. Each department handles the data it needs with less involvement from the center. On one hand, organizations like centralized BI for higher security; on the other, they like the departmental data discovery and greater freedom that a decentralized BI system provides.

Drill Down

Drill down takes the user from a high-level view of data to a more granular one, allowing the user to go deeper into the data being analyzed: for example, from country to state to city to zip code.
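
A minimal sketch of that drill path with pandas; the sample rows are made up:

  import pandas as pd

  orders = pd.DataFrame({
      "country": ["US", "US", "US", "US"],
      "state":   ["CA", "CA", "NY", "NY"],
      "city":    ["LA", "SF", "NYC", "Buffalo"],
      "total":   [100, 80, 120, 40],
  })

  print(orders.groupby("country")["total"].sum())             # high level
  print(orders.groupby(["country", "state"])["total"].sum())  # drill to state
  print(orders.groupby(["country", "state", "city"])["total"].sum())  # city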

Drill Through

Drill through takes the user to a report relevant to the data being analyzed allowing the user to pass from one report to another while still analyzing the same data set.

Dynamic Dashboards

Rather than serving a single static user, a dynamic dashboard allows multiple users to access data and dashboards simultaneously without affecting other users. Dynamic dashboards let users update and add new content, import and export, create customized dashboard views, or integrate with platforms and SaaS applications.

E

Embedded Analytics >

Embedded Analytics ROI >

Embedded Business Intelligence (Embedded BI) >

Embedded Data Widgets

Widgets are simple, intuitive applications independent of the body of a website or device but easily embedded into it. Widget types include information, collection, control, and hybrid. Data widgets display one object, or a list of objects, using live data and may be programmed to respond to the website’s identity.

Data widgets are made up of the following types:

  • Data view – the contents of exactly one object.
  • Data grid – a list of objects in a table format.
  • Template grid – a list of objects in a tile view.
  • List view – a list of objects.

Embedded Reports

An embedded report is one that is placed within an existing web page and is updated when the page is updated.

Embedded reporting integrates data and reports within an application, allowing different users to view relevant dashboards and data streams and making the data easier to digest and analyze. It lets people access data in context, analyze it, and identify actionable insights efficiently in the applications they use every day.

Embedded Report Builder >

ETL: Extract Transform and Load

Extract, Transform, Load (ETL) is the process of copying data from one source into a database structured differently, or used in a different context, than the original. ETL is a foundational concept in data warehousing. ETL enforces data quality and consistency so that disparate sources can present consistent data on which end users can base decisions.
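
A minimal end-to-end sketch using Python’s built-in sqlite3; the source rows and table layout are made up:

  import sqlite3

  raw_rows = [("2024-01-05", "99.50"), ("2024-01-06", "101.25")]  # extract

  cleaned = [(day, float(price)) for day, price in raw_rows]      # transform

  conn = sqlite3.connect(":memory:")                              # load
  conn.execute("CREATE TABLE prices (day TEXT, price REAL)")
  conn.executemany("INSERT INTO prices VALUES (?, ?)", cleaned)
  conn.commit()
  print(conn.execute("SELECT * FROM prices").fetchall())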

F

FHIR Data

FHIR (Fast Healthcare Interoperability Resources) refers to a standard defining resources (data formats and elements) and an application programming interface (API) for electronic health records (EHR) exchange. The standard was created by the Health Level Seven International (HL7) health-care standards organization. Learn more about FHIR analytics.

G

Google BigQuery

Google BigQuery is a fully managed, serverless data warehouse on Google Cloud Platform that enables interactive analysis of massive data sets. It supports SQL queries backed by Google’s processing power. BigQuery lets users access, save, and share complex datasets, set limits, and specify user permissions.

Governed Data Discovery >

H

HOLAP

What is HOLAP? Hybrid Online Analytical Processing (HOLAP) is a type of database technology that combines the advantages of both Relational Online Analytical Processing (ROLAP) and Multidimensional Online Analytical Processing (MOLAP). ROLAP stores data in a relational database, which is flexible and scalable, but can be slow and complex for data analysis. MOLAP stores data in a multidimensional database, which is fast and efficient for data analysis, but can be rigid and limited for data storage.

HOLAP stores some data in a relational database and some data in a multidimensional database, depending on the type and level of data processing required. This way, HOLAP can achieve the optimal balance between data storage and data analysis, providing high performance, accuracy, and flexibility for both summary and detailed data queries.

HOLAP is often used for business intelligence (BI) and online analytical processing (OLAP) applications, such as data warehousing, data mining, data visualization, and data reporting. HOLAP can help businesses and organizations gain insights and make decisions based on large and complex datasets.

I

Infrastructure Costs >

Inline Analytics >

Interactive Data Viewer

Interactive Data Viewer (IDV) can perform instant calculations, sort data in a table and visualize results graphically. IDVs are simple, flexible and interactive tools that answer business questions accurately and quickly. IDV may be found as part of a dashboard or in other reports.

Infused Analytics

What is infused analytics? Infused analytics is analytics embedded within workflows, becoming a fundamental part of the user experience and bringing insight and action into the same context. Infused analytics is the last stage in the embedded analytics model: it is next to impossible to tell the difference between the application and the embedded analytics. Examples include personalization and live analytic content within applications.

K

Key Performance Indicators (KPIs)

What are key performance indicators? Key performance indicators (KPIs) are measurable values that determine how effectively an individual, team or organization is achieving a business objective. KPIs provide a focus for strategic and operational improvement, create an analytical basis for decision making and help focus attention on what matters most to a business or business unit.

Some examples of KPIs are:

  • Net profit: The amount of money left after deducting all expenses from revenue
  • Customer satisfaction: The degree to which customers are happy with a product or service
  • Employee retention: The percentage of employees who stay with an organization for a certain period of time
  • Website traffic: The number of visitors and sessions on a website
  • Project completion: The percentage of tasks or milestones completed within a project
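
Two of the KPIs above expressed as plain arithmetic; the figures are made up:

  def net_profit(revenue: float, expenses: float) -> float:
      """Money left after deducting all expenses from revenue."""
      return revenue - expenses

  def employee_retention(staff_at_start: int, leavers: int) -> float:
      """Percentage of employees who stayed over the period."""
      return 100 * (staff_at_start - leavers) / staff_at_start

  print(net_profit(1_000_000, 850_000))   # 150000.0
  print(employee_retention(200, 14))      # 93.0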

L

LDAP

What is LDAP? LDAP (Lightweight Directory Access Protocol) is an open protocol for distributed directory information services including authentication. BI applications use it to look up information from a server.
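
A minimal lookup sketch, assuming the third-party ldap3 package and a hypothetical directory at ldap.example.com:

  from ldap3 import Server, Connection, ALL

  server = Server("ldap://ldap.example.com", get_info=ALL)
  conn = Connection(server, user="cn=reader,dc=example,dc=com",
                    password="secret")
  if conn.bind():                                  # authenticate
      conn.search("dc=example,dc=com", "(uid=jdoe)",
                  attributes=["cn", "mail"])       # look up a user
      print(conn.entries)
      conn.unbind()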

Low-Code

What is low-code? Low-code is a development environment for applications and processes using a graphical user interface rather than computer programming. They allow users to create applications without formal software development or coding knowledge.

M

Machine Learning (ML)

What is machine learning? A branch of AI, machine learning is the application of computer algorithms that allow systems to learn and improve through experience and the use of data, becoming more accurate at predicting outcomes without being explicitly programmed to do so.
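
A minimal “learn from sample data, predict on new data” sketch with scikit-learn; the tiny dataset is made up:

  from sklearn.linear_model import LogisticRegression

  X = [[1, 20], [2, 35], [8, 80], [9, 95]]   # sample feature rows
  y = [0, 0, 1, 1]                           # known outcomes

  model = LogisticRegression()
  model.fit(X, y)                            # learn from experience/data
  print(model.predict([[7, 70]]))            # classify an unseen case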

Managed Reporting

Managed reporting is part of a BI model in which, unlike ad hoc reports, reports are developed by users with an IT background and knowledge of CSV files and SQL queries. The goal of managed reports is to ensure end users achieve the company’s goals, with emphasis on its strategies. A good managed reporting model gives end users an easy way to do jobs such as spotting inefficiencies in manufacturing and taking proper action to correct them.

Metadata

What is metadata? Metadata is “data about data”. It summarizes information about other data. Types of metadata include descriptive, structural, administrative, reference, statistical and legal metadata.

Microservices

What are microservices? Microservices are application components that work together as a collection of loosely coupled services connected via APIs, forming a microservices-based application architecture. This architecture can be more agile and pluggable because each microservice is developed and scaled independently. Companies like Amazon, Netflix, and Uber ascribe their success in part to microservices.

MOLAP

What is MOLAP? MOLAP stands for multidimensional online analytical processing. It is a type of OLAP (online analytical processing) that indexes directly into a multidimensional collection of data and extracts the data stored there. This increases the effectiveness of data extraction and query performance because of the way the data is stored.

MongoDB

What is MongoDB? MongoDB is a NoSQL database that stores JSON-like documents with flexible schemas. It enables businesses to access and scale their data more quickly and is one of the most popular databases for large and flexible data requirements.
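
A minimal sketch assuming the pymongo driver and a MongoDB server on localhost; the document fields are made up:

  from pymongo import MongoClient

  client = MongoClient("mongodb://localhost:27017")
  users = client["appdb"]["users"]            # database and collection

  # Schema-flexible: documents in one collection can differ in shape.
  users.insert_one({"name": "Ada", "plan": "pro", "tags": ["beta"]})
  users.insert_one({"name": "Max", "signup": "2024-01-05"})

  print(users.find_one({"name": "Ada"}))      # the JSON-like document back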

Multi-Tenancy >

Multi-Tenant Analytics >

N

Native Applications

What are native applications? Native applications are designed to run on a specific system where the data resides to accelerate performance and reduce complexity.

No-Code

No-code refers to applications that business users or analysts with less programming experience build using a graphical user interface without having to write programming code.

NoSQL

NoSQL, or “not only SQL,” is a database design providing flexible ways to store and retrieve structured, semi-structured, and polymorphic data without complex object relational mapping.

O

ODBC

What does ODBC stand for? ODBC stands for Open Database Connectivity, an application programming interface (API) for relational database management systems (DBMS). It also lets you access data from many different sources.
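
A minimal sketch assuming the pyodbc package; the DSN, credentials, and table are placeholders for whatever your DBMS provides:

  import pyodbc

  conn = pyodbc.connect("DSN=warehouse;UID=reader;PWD=secret")
  cursor = conn.cursor()
  cursor.execute("SELECT id, name FROM customers WHERE region = ?", "West")
  for row in cursor.fetchall():
      print(row.id, row.name)    # columns accessible by name
  conn.close()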

OEM Embedded Analytics >

OLAP

What does OLAP stand for? OLAP stands for online analytical processing: the fast analysis of information from multiple databases to answer multi-dimensional analytical queries. OLAP spans relational databases, report writing, and data mining for business intelligence (BI) and decision support applications, so users can “slice and dice” data in multiple dimensions.

An example of OLAP is helping a user answer questions like “How many cars were sold in Los Angeles in 2020, and how many of those were sold to people over 60 years old?”
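
That question, expressed as a slice-and-dice query; the table and columns are hypothetical, shown with Python’s built-in sqlite3:

  import sqlite3

  conn = sqlite3.connect(":memory:")
  conn.execute("""CREATE TABLE car_sales
                  (city TEXT, year INTEGER, buyer_age INTEGER)""")
  conn.executemany("INSERT INTO car_sales VALUES (?, ?, ?)",
                   [("Los Angeles", 2020, 64), ("Los Angeles", 2020, 41)])

  # Slice: Los Angeles in 2020. Dice further: buyers over 60.
  sql = """SELECT COUNT(*) FROM car_sales
           WHERE city = 'Los Angeles' AND year = 2020 AND buyer_age > 60"""
  print(conn.execute(sql).fetchone()[0])   # -> 1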

OLTP

What does OLTP stand for? OLTP stands for online transactional processing: the processing of transactional queries while maintaining data integrity and throughput (transactions per second). OLTP supports transactional applications in a 3-tier architecture and handles the day-to-day transactions of an organization. ATM networks use OLTP systems.

For example, if two partners with a joint account use different machines at the same time to withdraw the entire balance, the person who completes authentication first gets the money. OLTP is used for tasks like inserting, updating, and deleting small amounts of data in a database accessed by a large number of users.
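
A minimal sketch of that withdrawal as a single atomic transaction, using Python’s built-in sqlite3; the schema is made up:

  import sqlite3

  conn = sqlite3.connect(":memory:")
  conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
  conn.execute("INSERT INTO accounts VALUES (1, 500.0)")
  conn.commit()

  def withdraw(amount: float) -> bool:
      try:
          with conn:  # one transaction: commits on success, rolls back on error
              (balance,) = conn.execute(
                  "SELECT balance FROM accounts WHERE id = 1").fetchone()
              if balance < amount:
                  raise ValueError("insufficient funds")
              conn.execute(
                  "UPDATE accounts SET balance = balance - ? WHERE id = 1",
                  (amount,))
          return True
      except ValueError:
          return False

  print(withdraw(500.0), withdraw(500.0))   # first succeeds, second fails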

Open Architecture

Open architecture is a platform approach that lets software developers customize the look of their applications for their customers. An open-architecture environment allows easy deployment and scaling, which reduces technology costs.

Operational Reporting

What are operational reports? They are short-term reports that detail the day-to-day activity of business operations, such as production, processes, costs, and expenditures, to support quick decision making.

P

Performance Dashboard

What is a performance dashboard? It is a management tool used to measure, monitor, and manage the activities and processes critical to achieving business goals. Performance dashboards can alert users to potential problems and help analyze them in a timely manner.

Predictive Analytics

What is Predictive Analytics? A type of analytics that uses data mining and machine learning to make predictions about future events using historical data and statistical modeling.

The three models of predictive analytics include:

  • Descriptive
  • Predictive
  • Decision

Prescriptive Analytics

What is prescriptive analytics? A type of data analytics, it is the final phase of business analytics, used to make decisions from the immediate to the long term. Prescriptive analytics builds on descriptive and predictive analytics. For example, a car manufacturer could go beyond its own company data by leveraging customer trends, historical trends, and predictions.

Q

Queries

What is a data query? A request for data or information from a database is called a query.

R

Report Building

What is report building? It is the act of creating reports that let end users see, understand, and act upon data. Report building is moving out of the IT department, placing more and more report-building power in the hands of the end user.

Report Developer in BI

The report developer is the person at an organization, usually in the IT department, tasked with developing reports focused on analysis and taking action. Report developers may be responsible for any number of report types, such as dashboards, heat maps, actionable key performance indicators (KPIs), automatic report scheduling, and business alerts. Effective report developers stay aligned with the company’s strategic goals.

Reporting Performance

Reporting performance is the speed and efficiency that reports are generated when users perform a query. Performance will vary according to system bandwidth, technical structure and architecture, concurrent users, data volume, data schema, report complexity, visualizations, etc.

Responsive Design

What is responsive design? Responsive design is the optimization of the user interface for viewing on any device screen (TV, desktop, tablet or mobile) or platform such as iOS, Google Android, and Windows.

REST API

What is a REST API? Representational State Transfer (REST) defines a set of constraints on how the architecture of an Internet-scale distributed hypermedia system, such as the Web, behaves: for example, handling multiple types of calls, returning different data formats, and even changing structurally through hypermedia. REST normally uses HTTP for Web APIs.
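
A minimal sketch of REST’s uniform interface, assuming the third-party requests package; the resource URLs are hypothetical:

  import requests

  base = "https://api.example.com"

  # Different verbs against the same resource express different actions.
  r = requests.get(f"{base}/orders/42",
                   headers={"Accept": "application/json"})
  print(r.status_code, r.json())             # read the resource as JSON

  r = requests.post(f"{base}/orders", json={"item": "flour", "qty": 3})
  print(r.status_code)                       # create a new resource

  r = requests.delete(f"{base}/orders/42")   # remove the resource
  print(r.status_code)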

ROLAP >

S

SaaS

SaaS stands for Software as a Service. It describes a cloud-based software application delivered over the internet via a website or apps. The software runs on the provider’s servers and is made available to users from there.

Scalability

What is scalability in SaaS applications? Scalability is the ability to quickly and easily increase or decrease IT resources to meet an increased or expanding workload.

Self-Service BI

What is self-service BI? Self-service business intelligence (BI) allows non-expert users to access and explore data to analyze, visualize and share reports and dashboards without depending on IT.

Self-Service Data Prep

What is self-service data prep? Self-service data preparation (SSDP) makes advanced data discovery tools accessible to users, no matter their skill level or IT know-how, so they can rapidly access, blend, and prepare data for analysis.

Single Sign-On

What is Single Sign-On? SSO allows users to log in using one set of credentials for any software system such as corporate apps, websites, and data to which they have been granted access.

T

Transactional Data

What is transactional data? Transactional data describes an event or exchange and has three components: a time, a numeric value, and one or more objects. Data about a product shipment is an example of transactional data.

W

Write-Back

What is a Data Write-Back? It enables users to update databases or data sources directly within a BI application.

Webhooks

What is a webhook? A webhook is a way to augment or alter the behavior of a web page or application with custom callbacks: an automated message sent when an event occurs.
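
A minimal receiver sketch assuming the third-party Flask package; the route and payload fields are hypothetical:

  from flask import Flask, request

  app = Flask(__name__)

  @app.route("/webhooks/order-shipped", methods=["POST"])
  def order_shipped():
      event = request.get_json()              # the automated message
      print("shipment for order", event.get("order_id"))
      return "", 204                          # acknowledge receipt

  if __name__ == "__main__":
      app.run(port=8080)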
