Small Data - The Modern 'Big-Data'

When working our way through big data in a connected, digitized industrial ecosystem, does all that information provide amplified insights? Do actionable insights grow at the same pace as the data? Do greater volume and variety of data correlate directly with more precise insights? Gartner research from May 2021 predicts that 70 percent of organizations will shift focus away from "big data" to "small and wide data" by 2025. Another survey, by Seagate, finds that 68 percent of the data in big-data systems goes unutilized.

So, while big datasets with variety and volume are still needed for deeper analysis using business intelligence and statistical methods, the speed of modern business and the agility of response it demands have forced enterprises to shift focus to analysing a wider set of data sources, each producing significantly smaller amounts of data, using mathematics and AI.

Enter small and wide data

Small data is a subjective measure. People define it as datasets small enough in volume and format to be accessible, informative, actionable, and comprehensible. Small data typically addresses a specific question or a specific problem. Examples of small data include baseball scores, inventory reports, driving records, sales data, biometric measurements, search histories, weather forecasts and usage alerts. Each of these can provide specific, actionable, real-time prescriptive insights for a course of action.

The real value of small and wide data analysis lies in its ability to help make comparisons and extract specific insights from single, small data components, opening up new use cases that need less training data. It marks a departure from data-hungry analytical models towards alternative techniques that thrive on a diversity of narrow-band data, such as time-series analysis.
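To make that concrete, here is a minimal sketch of small-data time-series analysis: a rolling z-score that flags an anomalous sensor reading from only a few dozen points. The window size, threshold and temperature values are illustrative assumptions, not prescriptions.

```python
# Flag readings that deviate sharply from the recent past -- a simple
# narrow-band technique that needs no big-data pipeline.
import statistics

def rolling_anomalies(readings, window=12, threshold=3.0):
    """Return indices of readings whose z-score against the trailing window is extreme."""
    anomalies = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mean = statistics.fmean(recent)
        stdev = statistics.stdev(recent)
        if stdev and abs(readings[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# A few dozen points are already enough to act on.
temps = [21.0, 21.2, 21.1, 21.3, 21.2, 21.4, 21.3, 21.2,
         21.5, 21.3, 21.4, 21.2, 21.3, 27.8, 21.4, 21.3]
print(rolling_anomalies(temps))  # [13] -- the 27.8 spike
```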

Here are some considerations for small data.

Autonomous production line

A great example of effectively leveraging 'small data' is its use in self-driving vehicles, where the main source of data is a sophisticated system of network-based structures that pull information from outside the car. The car creates and maintains data from sensors and cameras placed in and around itself. In real time, the AI system produces a prescriptive insight that executes a definitive decision: when there is an obstacle in the way, the car cannot afford to weigh a set of predictive options for stop, go or swerve.

The same technique can be leveraged in a manufacturing plant, where, conventionally, human decision-making is needed at the end of the production line. A hedged sketch of such a decision appears below.
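As an illustration, the sketch below maps a handful of per-unit sensor readings to exactly one definitive action, the production-line analogue of stop, go or swerve. The feature names and thresholds are hypothetical, not the method of any particular plant.

```python
# A prescriptive end-of-line check: a few readings per unit yield one action.
from dataclasses import dataclass

@dataclass
class UnitReading:
    weight_g: float        # measured unit weight in grams (hypothetical feature)
    torque_nm: float       # fastener torque in newton-metres (hypothetical feature)
    surface_defects: int   # defect count from a camera inspection (hypothetical feature)

def end_of_line_action(unit: UnitReading) -> str:
    """Return exactly one action: scrap, rework, or pass."""
    if unit.surface_defects > 3 or unit.torque_nm < 1.0:
        return "scrap"
    if not 495.0 <= unit.weight_g <= 505.0 or unit.surface_defects > 0:
        return "rework"
    return "pass"

print(end_of_line_action(UnitReading(weight_g=501.2, torque_nm=2.4, surface_defects=0)))  # pass
```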

Hyper-personalized propositions

Small data works at the level of the individual business, problem, even human being, and its insights pertain exactly to each of these. Leveraging this, organizations can take actions that are hyper-personalized to the needs and context of a specific customer or employee. One example is using small and wide data to predict with greater accuracy whether a customer will be receptive to a marketing campaign, and act accordingly.
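A minimal sketch of that example, assuming a small, wide table of customer attributes and past campaign responses; all column names and values here are hypothetical, and the point is that a simple model on a few rows can already drive one individual-level action.

```python
# Predict receptiveness to a campaign from a handful of wide features.
from sklearn.linear_model import LogisticRegression

# features per customer: [recent_visits, days_since_purchase, opened_last_email]
X = [[5, 3, 1], [0, 200, 0], [8, 7, 1], [1, 90, 0],
     [6, 14, 1], [0, 300, 0], [7, 5, 1], [2, 60, 0]]
y = [1, 0, 1, 0, 1, 0, 1, 0]  # 1 = responded to a past campaign

model = LogisticRegression().fit(X, y)

new_customer = [[4, 10, 1]]
if model.predict(new_customer)[0] == 1:
    print("include in campaign")  # act on the individual-level insight
else:
    print("suppress the send")
```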

Data privacy

Lending strength to the factors favouring small and wide data is the concern around data privacy and protection. Regulators and consumers alike are increasingly apprehensive about how personal information - invariably big data - could be misused. This is driving up entry barriers to big data in the form of stricter laws and lower willingness among consumers to share data with third parties.

Language translations in a global enterprise

Translating one spoken language to another is a fascinating neural-net problem, as there are many permutations, impacted by grammar, style, geography and etymology. However, the key to improving accuracy and semantics with some of these techniques, such as 'transfer learning', is to gradually train on smaller and cleaner subsets of data.
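A hedged sketch of that fine-tuning step, in PyTorch: freeze a pretrained network's general-purpose layers and train only a small task-specific head on a small, clean subset. The model and data are stand-ins, not a real translation system.

```python
import torch
import torch.nn as nn

pretrained = nn.Sequential(            # stand-in for a large pretrained network
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),
)
head = nn.Linear(32, 10)               # task-specific layer trained from scratch

for param in pretrained.parameters():  # keep the general knowledge fixed
    param.requires_grad = False

optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# a small, clean subset: 100 curated examples rather than millions of noisy ones
X = torch.randn(100, 128)
y = torch.randint(0, 10, (100,))

for epoch in range(20):
    optimizer.zero_grad()
    logits = head(pretrained(X))       # frozen features feed the trainable head
    loss = loss_fn(logits, y)
    loss.backward()
    optimizer.step()
```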

In this decade, the data economy of the past is giving way to the economics of AI, where cognitive actions otherwise executed by humans in an enterprise are being executed at scale, with more accuracy and more accountability, by robots. The big-data revolution, which fuelled the rise of business intelligence and data science for predictive insights, is being side-lined by small data in real time for driving prescriptive actions.

##

ABOUT THE AUTHOR

Gary Bhattacharjee 

Gary Bhattacharjee heads up the Global Practice for Artificial Intelligence (AI) at Infosys, enabling businesses with AI-led solutions. Gary started at IBM Australia, leading product development for CoreBank©. At Citi, he led the development of the Corporate Banking platform. Leading Financial Services Consulting at HP, he developed various strategic solutions for the industry. At Morgan Stanley, he led Strategy and Analytics for Wealth Management, achieving business goals through AI. He recently co-founded a FinTech startup, where he built an autonomous platform to digitize the manual, paper-based operations of Trade Finance using machine learning and blockchain. Gary graduated from the Indian Institute of Technology (IIT) with a Bachelor's in Electronics. He holds a patent on Management of Data via Cooperative Method and System, a wiki-based approach for managing structured data.

Published Wednesday, September 08, 2021 7:35 AM by David Marshall