
Technology Terms You Should Know!

More great Tech-Terms from our friends at Tech Target

Continuous Delivery

Continuous delivery (CD) is an extension of the concept of continuous integration (CI). Whereas CI deals with the build/test part of the development cycle for each version, CD focuses on what happens with a committed change after that point. With continuous delivery, any commit that passes the automated tests can be considered a valid candidate for release.
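
As a rough illustration of that release gate, here is a minimal Python sketch, assuming a pytest-based test suite; publish_candidate is a hypothetical stand-in for whatever promotes a build in a real pipeline:

    import subprocess

    def run_tests() -> bool:
        """Run the automated test suite; exit code 0 means every test passed."""
        result = subprocess.run(["pytest", "-q"])  # assumption: pytest is the runner
        return result.returncode == 0

    def publish_candidate(commit_sha: str) -> None:
        """Hypothetical stand-in for promoting a build to UAT or staging."""
        print(f"Commit {commit_sha} is a valid release candidate")

    def continuous_delivery_gate(commit_sha: str) -> None:
        # The core CD rule: any commit that passes the automated tests
        # is a valid candidate for release.
        if run_tests():
            publish_candidate(commit_sha)
        else:
            print(f"Commit {commit_sha} failed tests; not a release candidate")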

An important goal of continuous delivery is to make feedback loops as short as possible. Because code is delivered in a steady stream to user acceptance testing (UAT) or the staging environment, cause and effect can be observed early and code can be tested for all aspects of functionality, including business rule logic (something unit tests can’t do reliably).

If an iterative process is becoming unwieldy due to increasing project complexity, CD offers developers a way to get back to doing smaller, more frequent releases that are more reliable, predictable and manageable. When CD is ongoing and testing occurs early, a concept sometimes referred to as “shift left,” developers can start working on fixes before they have moved on to another aspect of the development project. This can help increase productivity because it minimizes the effort that’s required for developers to refocus on the initial task.

Q&A on Continuous Delivery with Windows and .Net

A New Way to Do Continuous Delivery With Maven and Jenkins Pipeline

Fast Data

Fast data is the application of big data analytics to smaller data sets in real time or near-real time in order to solve a problem or create business value. The term fast data is often associated with self-service BI and in-memory databases. The concept plays an important role in native cloud applications that require low latency and depend upon the high I/O capability that all-flash or hybrid flash storage arrays provide.

The goal of fast data analytics is to quickly gather and mine structured and unstructured data so that action can be taken. As the flood of data from sensors, actuators and machine-to-machine (M2M) communication in the Internet of Things (IoT) continues to grow, it has become more important than ever for organizations to identify what data is time-sensitive and should be acted upon right away and what data can sit in a database or data lake until there is a reason to mine it.
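
As a minimal sketch of that triage step (the handler names and the five-second latency budget are assumptions, not anything prescribed above), time-sensitive readings take the fast path while everything else is parked for later mining:

    import time

    LATENCY_BUDGET_SECONDS = 5  # assumption: older readings lose their immediate value

    def route_reading(reading: dict) -> str:
        """Route an IoT reading: act on time-sensitive data, archive the rest."""
        age = time.time() - reading["timestamp"]
        if reading["time_sensitive"] and age < LATENCY_BUDGET_SECONDS:
            return "act_now"        # e.g. trigger an alert or a control action
        return "store_for_later"    # e.g. append to a data lake for batch mining

    # A fresh, time-sensitive sensor reading goes down the fast path.
    reading = {"sensor": "turbine-7", "timestamp": time.time(), "time_sensitive": True}
    print(route_reading(reading))  # -> act_now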

In the future, it is expected that some fast data applications will rely on rapid batch data while others will require real-time streams. Potential use cases for fast data include:

  • Smart grid applications that can analyze real-time electric power usage at tens of thousands of locations and automatically initiate load shedding to balance supply with demand in specific geographical areas (sketched in code after this list).
  • Smart window display applications that can identify a potential customer’s demographic profile and generate a discount code or other special offer when they enter the store.
  • Smart surveillance cameras that can continuously record events and use predictive analytics to identify and flag security anomalies as they occur.
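
To make the smart grid bullet concrete, here is a minimal sketch with made-up numbers (the 60-reading window and the supply figure are assumptions): a sliding-window average of recent usage triggers load shedding when demand outpaces supply.

    from collections import deque

    WINDOW_SIZE = 60      # assumption: keep the last 60 readings per location
    SUPPLY_KW = 5000.0    # assumption: available supply for this area

    class LoadMonitor:
        """Tracks recent power usage for one location and flags overload."""

        def __init__(self) -> None:
            self.readings = deque(maxlen=WINDOW_SIZE)

        def add_reading(self, kw: float) -> bool:
            """Record a usage reading; return True when load shedding should start."""
            self.readings.append(kw)
            average = sum(self.readings) / len(self.readings)
            return average > SUPPLY_KW

    monitor = LoadMonitor()
    for usage_kw in (4800.0, 5100.0, 5400.0):  # streaming readings, most recent last
        if monitor.add_reading(usage_kw):
            print("Demand exceeds supply: initiate load shedding")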

Which Do We Need More: Big Data or Fast Data?

Fast data: The next step after big data

Data Modeling

Data modeling is the process of documenting a complex software system design as an easily understood diagram, using text and symbols to represent the way data needs to flow. The diagram can be used as a blueprint for the construction of new software or for re-engineering a legacy application.

Traditionally, data models have been built during the analysis and design phases of a project to ensure that the requirements for a new application are fully understood. A data model can be thought of as a flowchart that illustrates the relationships between data. Although capturing all the possible relationships in a data model can be very time-intensive, it’s an important step that shouldn’t be rushed. Well-documented conceptual, logical and physical data models allow stakeholders to identify errors and make changes before any programming code has been written.

Data modelers often use multiple models to view the same data and ensure that all processes, entities, relationships and data flows have been identified. There are several different approaches to data modeling, including:

Conceptual Data Modeling – identifies the highest-level relationships between different entities.

Enterprise Data Modeling – similar to conceptual data modeling, but addresses the unique requirements of a specific business.

Logical Data Modeling – illustrates the specific entities, attributes and relationships involved in a business function. Serves as the basis for the creation of the physical data model.

Physical Data Modeling – represents an application and database-specific implementation of a logical data model.
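
As a small illustration of the logical-to-physical step, here is a hedged sketch using SQLAlchemy (one tool among many; the entities and attributes are invented): the Customer and Order entities and their one-to-many relationship belong to the logical model, while the table names, column types and foreign key are the physical, database-specific details.

    from sqlalchemy import Column, ForeignKey, Integer, String
    from sqlalchemy.orm import declarative_base, relationship

    Base = declarative_base()

    class Customer(Base):
        """Logical entity: a customer places many orders."""
        __tablename__ = "customers"             # physical detail: table name
        id = Column(Integer, primary_key=True)  # physical detail: surrogate key
        name = Column(String(100), nullable=False)
        orders = relationship("Order", back_populates="customer")

    class Order(Base):
        """Logical entity, tied to Customer by a physical foreign key."""
        __tablename__ = "orders"
        id = Column(Integer, primary_key=True)
        customer_id = Column(Integer, ForeignKey("customers.id"), nullable=False)
        customer = relationship("Customer", back_populates="orders")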

Simplifying MySQL Database Design using a Graphical Data Modeling Tool

Big data challenges traditional data modeling techniques


Data-Driven Decision Management

Data-driven decision management (DDDM) is an approach to business governance that values decisions that can be backed up with verifiable data. The success of the data-driven approach is reliant upon the quality of the data gathered and the effectiveness of its analysis and interpretation.

In the early days of computing, it usually took a specialist with a strong background in technology to mine data for information because it was necessary for that person to understand how databases and data warehouses worked. If a manager on the business side of an organization wanted to view data at a granular level, they had to reach out to the information technology (IT) department and request a report. Someone from the IT department would then create the report and schedule it to run on a periodic basis. Because the process was complex, ad hoc reports (also known as one-off reports) were discouraged.

Today, business intelligence tools often require very little, if any, support from the IT department. Business managers can customize dashboards to display the data they want to see and run custom reports on the fly. These changes in how data can be mined and visualized allow business executives with no technology background to work with analytics tools and make data-driven decisions.
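
For a sense of what a custom report on the fly looks like in practice, here is a minimal sketch using pandas with invented sales figures, no IT ticket required:

    import pandas as pd

    # Invented sales records standing in for data pulled from a BI tool or warehouse.
    sales = pd.DataFrame({
        "region":  ["East", "West", "East", "West"],
        "product": ["A", "A", "B", "B"],
        "revenue": [1200.0, 950.0, 1430.0, 880.0],
    })

    # An ad hoc report: total revenue by region, computed on demand.
    report = sales.groupby("region")["revenue"].sum().sort_values(ascending=False)
    print(report)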

Data-driven decision management is usually undertaken as a way to gain a competitive advantage. A study from the MIT Center for Digital Business found that organizations driven most by data-based decision making had 4% higher productivity rates and 6% higher profits. However, integrating massive amounts of information from different areas of the business and combining it to derive actionable data in real time can be easier said than done. Errors can creep into data analytics processes at any stage of the endeavor, and serious issues can result when they do.


Healthcare organizations have continued to expand and enhance their data-driven decision management.

Here is a recent example of how population health strategy is helping patients:

Why Suicide Prevention Is Part of Population Health Strategy




