DATA AS CAPITAL: THE VALUE OF INFORMATION IN THE DIGITAL ECONOMY

Qosimov, M. (2025). DATA AS CAPITAL: THE VALUE OF INFORMATION IN THE DIGITAL ECONOMY. Решение социальных проблем в управлении и экономике, 4(8), 125–148. Retrieved from https://www.inlibrary.uz/index.php/sspme/article/view/109451



SOLUTION OF SOCIAL PROBLEMS IN MANAGEMENT AND ECONOMY

International scientific-online conference

DATA AS CAPITAL: THE VALUE OF INFORMATION IN THE DIGITAL ECONOMY

Qosimov Mador

Tashkent State University of Economics

Tashkent, Uzbekistan

https://doi.org/10.5281/zenodo.15687907

Abstract

This report posits that data has emerged as a distinct and critical form of capital, standing alongside traditional financial and human capital in its capacity
to generate new digital products and services. Economically, capital is defined as
a produced good necessary for further production, and data—as recorded
information—literally fulfills this criterion.

Data possesses unique characteristics that differentiate it from conventional capital. It is inherently non-rivalrous, meaning a single dataset can
be utilized concurrently by multiple entities or applications without diminishing
its utility or availability for others, which enables significant economies of scale
and scope. Furthermore, data is non-fungible; individual pieces or datasets are
often unique and cannot be easily substituted, highlighting its scarcity and
unique informational content, contrary to the common perception of data as
merely abundant. The true economic value of data often becomes apparent only
after it has been processed, analyzed, and applied, making it an "experience
good" that introduces an inherent degree of uncertainty and risk in data-related
investments. While initial investments in data collection and infrastructure are
substantial, the marginal cost of collecting additional data or replicating existing
datasets is comparatively low, creating significant barriers to entry for smaller
organizations. To unlock its full value, data necessitates additional investments
in complementary assets, including advanced software, hardware, and
specialized human capital.

The report delves into quantitative and qualitative approaches for valuing data, analyzes its profound economic impacts at both firm and macroeconomic
levels, and addresses the complex governance issues surrounding data
ownership, access, and regulation. A notable observation is the paradox of data
abundance and scarcity. While data is voluminous, characterized by high
volume, velocity, and variety, its truly valuable components are scarce and
unique. This inherent scarcity of actionable data is what confers competitive
advantage and necessitates careful, nuanced approaches to valuation and
governance. This dynamic suggests that policy should not exclusively focus on
promoting data collection but also on mechanisms that encourage the creation,



sharing, and identification of genuinely high-quality, unique, and actionable data.
It also highlights the importance of addressing data hoarding and ensuring data
quality, as low-quality data, regardless of its volume, will have limited
productive potential.

The findings indicate that data demonstrably drives productivity and innovation, yet its unique economic characteristics pose significant challenges to
traditional accounting, valuation, and taxation systems. Policymakers face
critical trade-offs, particularly between fostering innovation and ensuring
privacy, and mitigating monopolization risks while promoting economic growth.
The report argues for the imperative of institutional innovation, including the
development of data trusts and public data commons, alongside ethical
valuation models, to ensure equitable value creation and address emerging
concerns like digital colonialism.

Introduction

The digital age has fundamentally reshaped economic landscapes, introducing a new form of productive asset: data. Data is not merely a byproduct
of digital interactions; it is literally a form of capital, defined in economics as a
"produced good... necessary for the production of another good or service". In
this context, data capital is the "recorded information necessary to produce a
good or service". This conceptualization contrasts sharply with traditional forms
of capital.

Physical capital, such as machinery or buildings, consists of tangible assets that are rivalrous, meaning they can only be used by one entity at a time, and
fungible, implying they can be substituted for one another. Their value is often
inherent in their physical form. Human capital, conversely, is embodied in
individuals through their skills, knowledge, and experience, with its value
realized through human effort and intellectual contribution. Data capital,
however, is an intangible asset that exhibits unique properties. It is non-
rivalrous, allowing for simultaneous use by multiple parties without depletion,
and non-fungible, possessing unique informational content that makes
individual pieces or datasets irreplaceable. Furthermore, its value is often
realized only after application and analysis, classifying it as an "experience
good" that can yield value over many years. As information without physical
form, data requires sophisticated storage in various media to be utilized
effectively.

The rise of the data economy marks a significant shift in global economic activity. This global digital ecosystem is characterized by the systematic



collection, organization, and exchange of data to generate economic value. This
era is profoundly shaped by automation, digitization, continuous knowledge
discovery, an abundance of information, and increased investments in research,
science, and education. Several key drivers underpin this transformation:

Platforms: Online vendors, social media networks, search engines, and payment gateways serve as primary conduits for collecting vast quantities of raw data, forming the backbone of data generation.

Internet of Things (IoT): The proliferation of connected devices, including GPS trackers and various sensors, continuously generates diverse streams of data, dramatically expanding the scope and granularity of data collection across industries.

Artificial Intelligence (AI) Ecosystems: Advances in AI and machine learning have fundamentally enhanced the capacity to measure, understand, and
derive actionable insights from data. This transformation is pivotal in shifting
data from a mere cost center to a strategic capital asset. The emergence of an
"algorithm economy," where algorithms themselves are traded as valuable
commodities, further underscores this profound shift.

A significant observation in this evolving landscape is the symbiotic relationship between data and AI/platforms. AI and machine learning have
dramatically improved the ability to measure and understand data, thereby
enabling its transformation into data capital. Concurrently, the data economy is
fundamentally driven by AI and platforms. This is not a one-way causal
relationship but a powerful feedback loop. The increasing volume and variety of
data generated by platforms and IoT devices provide the essential fuel for
training and refining AI models. In turn, more sophisticated AI capabilities
enhance the ability to extract deeper insights and value from this data, making
the data itself more productive and valuable as capital. This increased value
incentivizes further data collection and platform development, creating a self-
reinforcing cycle that accelerates the growth and evolution of the digital
economy. This dynamic implies that policies aimed at fostering the data
economy must adopt a holistic view, supporting not only data availability and
quality but also the development of AI technologies and robust platform
ecosystems. Conversely, any regulatory or economic friction applied to one
component, such as restricting data access, could have ripple effects, stifling
innovation and growth across the entire data-driven value chain.

Against this backdrop, this report seeks to answer critical questions:

How can data be valued as an economic asset?



What are the macroeconomic implications of data-driven productivity?

How should policymakers manage data ownership and access?

The subsequent sections will provide a comprehensive roadmap, beginning with a detailed literature review, followed by a discussion of the methodology
employed. The core of the report will then present an in-depth analysis of data
valuation models, illustrate data's practical application through various case
studies, and explore its macroeconomic implications. The report will conclude
with a discussion of the challenges, trade-offs, and future policy directions
necessary to navigate the complexities of the data as capital era.

Literature Review

The emergence of data as a pivotal economic asset necessitates a thorough examination of existing economic theories and empirical work. This section
reviews the academic and institutional literature relevant to understanding
data's role as capital.

Economic Theories on Intangible Capital and Knowledge Economies

The contemporary economy is increasingly characterized as a "knowledge-based economy" where the production of goods and services is predominantly
driven by knowledge, with a growing emphasis on intangible assets and
intellectual capital. This era is marked by automation, digitization, continuous
knowledge discovery, an abundance of information, and increased investments
in research, science, and education.

Fundamental properties distinguish intangible capital, including data, from physical assets:

Non-rivalry in Use: A core property of intangibles is their ability to be used simultaneously in multiple places and as inputs in various production
processes without depletion. This is facilitated by their storage in diverse media,
such as digital media for complex databases, which allows for widespread
application and leads to significant economies of scale and scope.

Limited Excludability: Unlike physical assets, intangibles are often difficult to exclusively control or prevent others from copying or imitating. The
degree of excludability varies, with patents offering more protection than an
open-source operating system, and is influenced by both technological features
of storage and the institutional environment.

Creating intangible capital requires investment, for instance, in research and development, marketing, or human capital accumulation. However, the
relationship between investment and accumulated intangible capital is less



certain than for physical capital, which has historically led to their distinct
accounting treatment.

A significant observation in this domain is the inherent tension between non-rivalry and excludability in data's value capture. Data's non-rivalrous
nature means it can be used by many simultaneously without being consumed,
enabling immense scalability and widespread value creation, as the same data
can fuel countless applications and insights. However, its limited excludability
means that data owners struggle to fully capture or appropriate the value they
generate. This difficulty in appropriation can lead to underinvestment in data
creation and quality, or, conversely, a drive towards monopolistic control to
enforce excludability and secure returns. The "spillover" potential of data is a
direct consequence of this tension. This tension is at the heart of the policy
challenge for data governance: how to design legal and institutional frameworks
that maximize the social value derived from data's non-rivalry, for example,
through promoting data sharing, open data initiatives, and public data commons,
while simultaneously ensuring sufficient private incentives, through appropriate
excludability mechanisms like intellectual property rights or data trusts, for data
collection, quality improvement, and innovation. This highlights the critical role
of legal frameworks in unlocking data's full economic potential.

Previous Work on the Monetization and Valuation of Data

Data monetization is defined as the conversion of intangible data value into tangible economic benefits, typically through direct sale, the creation of new products or services, or cost reduction. The global data monetization market is experiencing significant growth, valued at $2.99 billion in 2023 and projected to reach $11.83 billion by 2032. This trajectory indicates substantial opportunities for revenue expansion across various sectors.
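As a quick sanity check on the market figures above, the implied compound annual growth rate can be computed directly from the two cited values; the arithmetic below uses only the numbers reported in the text.

```python
# Implied compound annual growth rate (CAGR) of the data monetization
# market, using the figures cited above: $2.99B (2023) growing to a
# projected $11.83B (2032), i.e. nine years of compounding.
start_value = 2.99   # USD billions, 2023
end_value = 11.83    # USD billions, 2032 projection
years = 2032 - 2023

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 16-17% per year
```

A growth rate in this range is consistent with the "significant growth" characterization in the source projection.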

Despite this considerable potential, many companies face significant challenges in extracting measurable value from their data. Key obstacles
identified in the literature include a lack of skilled talent, pervasive poor data
quality, and data being trapped in corporate silos. Sustaining data monetization



initiatives is particularly difficult due to persistent data quality issues, which
often undermine initial successes.

A critical observation is that data quality serves as a prerequisite for monetization and value realization. The literature explicitly lists "poor data quality" and "data stuck inside corporate silos" as primary reasons why organizations struggle to realize tangible and measurable value from their data. This is reinforced by studies stating that "A critical prerequisite for successful data monetization is the establishment and maintenance of clean, high-quality data." Furthermore, research highlights "low quality, nonavailability, and lack of integration (fragmentation)" as major barriers to effective data use in healthcare. This indicates that data quality is not merely a technical detail but a fundamental economic bottleneck. If data is to function as productive capital, its quality directly determines its utility and the efficiency with which it can generate value.
Poor quality data is analogous to faulty machinery; it cannot produce reliable or
valuable output. This underscores that investments in data quality, robust data
governance frameworks, and initiatives to break down data silos are not merely
operational expenses but critical capital investments. These investments directly
unlock the productive potential of data, enabling successful monetization and
driving competitive advantage. Policy considerations should therefore include
incentives for developing and adhering to high data quality standards, especially
for data intended for sharing or public use.

Role of Data in Firm-Level and National Productivity Growth

Recent research indicates that a disproportionately small number of "Standout" firms contribute the majority of productivity growth at both firm and national levels. For instance, fewer than 100 such firms accounted for two-thirds of positive productivity gains in a sample of 8,300 large firms across Germany, the UK, and the US. These findings challenge conventional wisdom, which often assumes a more gradual and widespread diffusion of productivity gains across the economy.

Productivity growth in these leading firms tends to occur in "powerful bursts," driven more by bold strategic moves, top-line growth, and portfolio



shifts, characterized as "doing things differently," rather than incremental
efficiency improvements, or "doing things more efficiently". This suggests that
the most impactful productivity gains arise from fundamental reconfigurations
of business models and value propositions, often enabled by innovative uses of
data. Additionally, the efficient reallocation of employees from less productive to more productive firms plays a significant role in overall national productivity
growth, a phenomenon particularly evident in the US context.

A compelling observation arising from this research is the "power law" distribution of data-driven productivity. The consistent finding that a very small
number of "Standout" firms are responsible for the overwhelming majority of
productivity growth indicates a pattern where the benefits and capabilities
derived from leveraging data and AI are highly concentrated among a few
leading entities. This is not simply about general technological adoption; it is
about specific firms making transformative strategic moves that fundamentally
redefine value creation, often through innovative uses of data. This
concentration of productivity gains has significant macroeconomic implications,
potentially exacerbating economic inequality and leading to increased market
monopolization. It suggests that traditional policies focused on broad diffusion
of technology might be insufficient. Instead, policymakers need to consider
conditions that enable "Standouts" to emerge and scale, while simultaneously
addressing the negative externalities of market concentration and ensuring that
the benefits of data-driven growth are more broadly shared across the
economy.
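To make the concentration claim concrete, a small simulation illustrates how a heavy-tailed ("power law") distribution of firm-level gains places most of the aggregate total in a handful of firms. The Pareto tail parameter and the gains series are assumptions for illustration only; they are not estimates from the study cited above.

```python
import numpy as np

# Illustrative sketch (assumed parameters, not the study's data): when
# firm-level productivity gains follow a heavy-tailed distribution, a tiny
# fraction of "Standout" firms accounts for most of the aggregate gain.
rng = np.random.default_rng(0)
n_firms = 8_300                          # sample size echoing the study cited above
gains = rng.pareto(a=1.2, size=n_firms)  # heavy tail; shape a=1.2 is assumed

gains_sorted = np.sort(gains)[::-1]
top_100_share = gains_sorted[:100].sum() / gains.sum()
print(f"Share of total gains from the top 100 of {n_firms} firms: {top_100_share:.0%}")
```

Even though the top 100 firms are barely 1% of the sample, their share of the simulated total is many times larger, which is the qualitative pattern the "Standout" research describes.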

Frameworks from Institutions like the OECD, IMF, and WEF on Data Governance

International institutions have increasingly recognized the critical importance of data governance in the digital economy, developing frameworks
to guide governments and stakeholders.

OECD (Organisation for Economic Co-operation and Development): The OECD's framework emphasizes treating data as a strategic asset. Key
principles include eliminating obstacles to data management, sharing, and reuse;
fostering innovation in public policy and service design; and prioritizing open
data. It advocates for reinforcing trust across the data ecosystem, stimulating
investment, and promoting responsible data use, including across borders. The
framework is structured in strategic, tactical, and delivery layers, covering
organizational, policy, and technical elements. It also highlights the importance



of data standards to prevent loss, avoid misinterpretation, promote efficiency,
and facilitate sharing.

IMF (International Monetary Fund): The IMF's engagement focuses on good governance for macroeconomic stability, including anti-corruption efforts.
It promotes data transparency, quality, and timeliness through initiatives like
the Special Data Dissemination Standard (SDDS) and General Data
Dissemination System (GDDS). The IMF is also concerned with building trust in
the digital economy, particularly regarding privacy issues, suggesting that
privacy technologies can form the basis of trust if designed appropriately.
Crucially, the IMF argues for international agreement on common minimum
principles for the data economy to mitigate policy divergences that arise from
different national contexts and priorities. The IMF recognizes data as non-rival,
capable of supporting efficiency and inclusion, but also notes its potential for
price discrimination and algorithmic biases.

WEF (World Economic Forum): The World Economic Forum promotes "Data for Common Purpose," advocating for flexible data governance models
that allow for differentiated permissioning of data based on context. It supports
government-led data exchanges to facilitate the transition to a data-driven
economy. The WEF highlights open data governance as essential for
accountability, regulatory compliance, data quality standards, and transparency
regarding data use. Principles of open data governance include data being open
by default, timely and comprehensive, accessible and usable, and comparable
and interoperable.

A significant observation is the global imperative for harmonized data governance amidst divergent national priorities. Multiple international bodies
are actively developing and promoting data governance frameworks. The IMF, in
particular, explicitly calls for "international agreement on common minimum
principles for the data economy" to reduce "policy divergences". This indicates a
recognition that data, being inherently non-rivalrous and easily flowing across
borders, necessitates a coordinated global approach. Without such
harmonization, disparate national data policies, such as strict data localization
requirements or varying privacy regulations, could create significant
fragmentation, hindering cross-border data flows which are fundamental to
global economic activity. The underlying challenge is balancing national
sovereignty and diverse domestic priorities, for example, the EU's strong privacy
focus versus the US's emphasis on innovation, with the global, interconnected
nature of data as capital. A failure to achieve consistent global data governance



could become a major impediment to realizing the full economic potential of
data as capital, leading to digital protectionism, increased operational costs for
multinational firms, and a slowdown in innovation diffusion. This necessitates
sustained diplomatic efforts and multi-stakeholder collaboration to establish
interoperable frameworks and common principles, rather than attempting to
impose monolithic global rules.

Gaps in Understanding How Data Functions as Long-Term Productive Capital

Despite the growing recognition of data as capital, significant gaps remain in the understanding and measurement of its function as a long-term productive
asset.

Measurement Error in Capital: A pervasive challenge in productivity analysis is the inherent measurement error in constructing capital stock,
particularly for intangible assets. These errors accumulate over time and can
lead to an underestimation of capital's true contribution to output and economic
growth.
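The mechanism behind this underestimation is classical attenuation bias: noise in the measured capital series pulls the estimated output elasticity toward zero. The simulation below is an illustrative sketch with assumed parameter values, not a calibration to any actual data.

```python
import numpy as np

# Illustrative sketch (all parameters assumed): classical measurement
# error in the (log) capital-stock series attenuates its estimated
# output elasticity, understating capital's true contribution.
rng = np.random.default_rng(1)
n = 10_000
true_elasticity = 0.3

log_k = rng.normal(0.0, 1.0, n)                            # true log capital stock
log_y = true_elasticity * log_k + rng.normal(0.0, 0.5, n)  # log output
log_k_measured = log_k + rng.normal(0.0, 1.0, n)           # noisy measured capital

# OLS slope of log output on the mismeasured regressor
slope = np.cov(log_y, log_k_measured)[0, 1] / np.var(log_k_measured)
print(f"True elasticity: {true_elasticity}, estimated: {slope:.2f}")
# expected attenuation factor = var(k) / (var(k) + var(noise)) = 0.5 here
```

With equal signal and noise variances the estimate converges to roughly half the true elasticity, which is the sense in which accumulated measurement error makes capital's contribution look smaller than it is.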

Inter-temporal Allocation: Unlike other production inputs, capital purchases, including intangible ones like data, are consumed over extended
periods. This necessitates complex inter-temporal allocation of expenditures
based on an asset's expected lifespan, a challenge not present with inputs
consumed contemporaneously.
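In practice, the simplest form of this inter-temporal allocation is a straight-line schedule spreading a one-time expenditure over the asset's expected lifespan. The figures below are hypothetical, chosen only to make the mechanics visible.

```python
# Minimal sketch (hypothetical figures): straight-line allocation of a
# one-time data-acquisition expenditure over its expected useful life,
# as inter-temporal allocation of a capital input requires.
acquisition_cost = 900_000   # hypothetical up-front expenditure
useful_life_years = 3        # hypothetical expected lifespan

annual_charge = acquisition_cost / useful_life_years
schedule = [annual_charge] * useful_life_years
print(schedule)  # [300000.0, 300000.0, 300000.0]
```

Inputs consumed contemporaneously (such as electricity or raw materials) need no such schedule, which is exactly the contrast the paragraph draws.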

Lack of Market Valuation: Many intangible assets, including data, are not actively traded in liquid markets, making it difficult to establish clear market
valuations. Current national accounting practices often resort to estimating their
value based on the cost of inputs used for their production.

Depreciation of Intangibles: While physical assets experience wear and tear, intangible assets like software or data do not physically deteriorate. Their economic value diminishes primarily due to obsolescence, for example, becoming technically outdated, changing preferences, or the introduction of superior alternatives. Accurately measuring this decline is crucial for reliable productivity statistics and national accounts. Novel methodologies, such as using Google Search Volume data, are being explored to estimate depreciation


rates for digital intangibles, with findings suggesting that newer digital assets
may exhibit steeper depreciation due to shorter lifecycles.

Challenges in Accounting: Traditional accounting frameworks struggle to fully incorporate "missing capitals" like environmental, human, and intangible
assets. Issues include the absence of market prices, difficulties in comparing
non-market valuations with exchange values, and complexities in attributing
benefits across different economic owners. The inconsistent use of "shadow
prices" further complicates efforts to capture the broader social welfare value of
these assets.

Data Quality and Linkage Issues: Low data quality, non-availability, and fragmentation significantly hinder research and effective decision-making.
Furthermore, big data sources can be highly volatile and selective, with coverage
changing daily, leading to inexplicable jumps in time-series data. Often,
individual observations in big data sets lack linking variables, severely limiting
the ability to correct for selectivity or integrate with other datasets for
comprehensive analysis.

A fundamental observation is the "invisible hand" of data capital and its measurement gap. Data is consistently identified as a central and essential
driver of economic activity and value creation across increasingly more sectors
of contemporary capitalism. However, traditional national accounting systems
and economic measurement tools struggle to fully capture its value, investment,
and depreciation. This creates a significant "measurement gap," rendering the
true contribution of data to GDP and productivity partially "invisible" or
systematically underestimated. The shift towards a "weightless economy,"
where intangible assets play an increasingly central role, means that reliance on
outdated metrics provides an incomplete and potentially misleading picture of
economic reality. This persistent measurement gap has profound implications
for effective policymaking. Decisions regarding investment, innovation, and
economic growth are being made based on an incomplete understanding of the
economy's true drivers. It necessitates fundamental methodological innovation
in national statistics, such as the upcoming 2025 System of National Accounts
(SNA) expanding to include data assets, and proposals for a Gross Domestic
Knowledge Product. This highlights the critical role of a re-evaluation of how
investment in data is conceptually treated. The challenge is not merely technical
but deeply conceptual, requiring economists and statisticians to adapt their
frameworks to the unique and dynamic properties of data.

Methodology



To comprehensively explore data as capital, this report employs a multi-faceted methodology grounded in economic theory and supported by diverse
data sources.

Theoretical Framework: Economic Models of Capital Valuation Adapted to Data

The theoretical framework adapts established economic models of capital valuation to account for the unique characteristics of data as capital. This
involves extending concepts from the literature on intangible capital, which
already grapples with the challenges of non-rivalry and limited excludability.
The approach considers data as a fundamental factor of production,
acknowledging its potential for increasing returns at the macro level due to its
non-rivalrous nature and spillovers.

Specific adaptations include:

Production Function Analysis: Traditional production functions, such as the Cobb-Douglas model, are modified to explicitly incorporate data as an input.
This adaptation accounts for data's unique properties and the potential for
measurement error. The analysis explores methods to estimate data's marginal
productivity, quantifying the additional output generated by an incremental unit
of data, similar to how the marginal product of labor or physical capital is
assessed.
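A minimal numerical sketch of this extension treats data as a third Cobb-Douglas input alongside capital and labor; all parameter values and quantities below are illustrative assumptions, not estimates from the report.

```python
# Sketch of a Cobb-Douglas production function extended with data as a
# third input (all values are illustrative assumptions):
#     Y = A * K**alpha * L**beta * D**gamma
A, alpha, beta, gamma = 1.0, 0.3, 0.6, 0.1
K, L, D = 100.0, 50.0, 200.0   # capital, labor, data stock (arbitrary units)

Y = A * K**alpha * L**beta * D**gamma

# Marginal product of data: the extra output from one more unit of data,
# dY/dD = gamma * Y / D for this functional form.
mp_data = gamma * Y / D
print(f"Output: {Y:.2f}, marginal product of data: {mp_data:.4f}")
```

Estimating gamma from firm-level data is what would turn this accounting identity into a measured marginal productivity of data, mirroring how labor and capital elasticities are estimated.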

Asset Pricing Models: Models are adapted to reflect data's "experience good" nature and its significant option value. This recognizes that the true value
of data is often uncertain at the point of collection and is realized only after it
has been put to use, potentially yielding unforeseen benefits in the future.
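One way to make the "experience good" framing concrete is a simple expected-NPV calculation in which the payoff is only learned after analysis and a worthless dataset can be abandoned at no further cost. The probabilities and payoffs below are hypothetical, chosen purely for illustration.

```python
# Illustrative sketch (all figures hypothetical): valuing a dataset as an
# "experience good". Collection cost is paid up front; the payoff is
# uncertain and revealed only after analysis, with an option to abandon
# (payoff floor of zero) if the data proves useless.
collection_cost = 50.0
discount_rate = 0.10
scenarios = [            # (probability, payoff revealed after analysis)
    (0.2, 400.0),        # data proves highly valuable
    (0.3, 100.0),        # moderately valuable
    (0.5, 0.0),          # worthless; project abandoned at no further cost
]

expected_payoff = sum(p * v for p, v in scenarios)
npv = expected_payoff / (1 + discount_rate) - collection_cost
print(f"Expected payoff: {expected_payoff:.0f}, NPV: {npv:.1f}")  # NPV = 50.0
```

The positive NPV here rests entirely on the uncertain upside scenarios, which is why such investments carry the inherent risk the text describes.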

Depreciation Models: Methodologies for assessing the economic depreciation of data assets are incorporated, focusing on obsolescence rather
than physical wear and tear. This may leverage novel data sources like Google
Search Volume to estimate the decline in relevance and economic utility of
digital intangibles over time.
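A sketch of this approach fits exponential decay to a relevance proxy such as normalized search interest; the series below is made up for illustration and is not actual Google Search Volume data.

```python
import numpy as np

# Illustrative sketch (hypothetical series): estimating an
# obsolescence-driven depreciation rate for a digital asset by fitting
# exponential decay to a relevance proxy, e.g. normalized search interest.
years = np.arange(6)
search_interest = np.array([100.0, 74.0, 55.0, 41.0, 30.0, 22.0])  # made-up proxy

# Log-linear fit: log(interest) = log(I0) - delta * t,
# so the (negated) slope is the implied annual depreciation rate delta.
delta = -np.polyfit(years, np.log(search_interest), 1)[0]
print(f"Implied annual depreciation rate: {delta:.1%}")  # ~30% per year
```

A steeper fitted slope for newer digital assets would correspond to the shorter lifecycles noted in the literature review.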

Data Sources

The analysis draws upon a combination of primary and secondary data sources to provide a comprehensive view of data as capital.

Corporate Disclosures: Annual reports, investor presentations, and public filings from leading technology firms and platforms (e.g., Google, Amazon,
Microsoft, Meta) are analyzed. While these documents do not explicitly value
"data assets" as a distinct line item, they provide implicit insight into how
companies leverage data for revenue generation, growth, and strategic



advantage. This is often evident through discussions of user growth, research
and development investments, and the valuation of intangible assets.

National Accounts: Macroeconomic data from international bodies such as the International Monetary Fund (IMF), World Bank, and OECD, as well as
national statistical offices like Eurostat and Statistics Canada, are utilized. This
includes digital economy indicators, gross fixed capital formation (GFCF), Gross
Domestic Product (GDP), and capital stock estimates. This data is crucial for
analyzing macroeconomic implications and understanding the evolving
treatment of intangible capital in official national statistics.

Case Studies: In-depth examinations of specific industries and firms demonstrate data's practical application and value creation. For example:

o E-commerce (Amazon): Analysis of how Amazon leverages data for hyper-personalization, demand forecasting, and optimized inventory management, contributing to its record profits.

o Digital Health (NHS England): Exploration of the Foresight AI project, illustrating how de-identified NHS data is used for predictive healthcare, population health insights, and improving healthcare efficiency and fairness.

o Fintech: Examples of data-driven innovations such as AI-powered chatbots, Robotic Process Automation (RPA), and advanced fraud detection systems, highlighting their impact on banking efficiency and customer satisfaction.

o Big Tech (Google): Examination of how Google utilizes big data analytics for enhancing search algorithms, driving product development, and enabling targeted advertising, which reinforces its market dominance.

A key observation is the challenge of "proxying" data value from traditional financial statements. Corporate disclosures primarily report traditional financial
metrics like revenue, EBITDA, or user growth, rather than explicitly valuing
"data assets" as a distinct line item. This means that to understand data's
contribution, researchers must infer its impact on these traditional metrics,
essentially "proxying" its value. For example, Amazon's personalization engine
drives higher conversion rates and revenue, but the specific monetary value of
the underlying data is not directly reported. This makes direct attribution of
value to data extremely difficult, as its impact is deeply embedded across various
aspects of the business model. This inherent difficulty in direct measurement
highlights a critical gap between the economic reality of data as capital and
current accounting practices. It underscores the urgent need for new accounting
standards, such as the anticipated 2025 SNA inclusion of data assets, and for



companies to develop sophisticated internal data valuation frameworks.
Without such advancements, financial markets may continue to undervalue or
misprice data-intensive companies, and firms themselves may not optimally
manage or invest in their data assets, leading to inefficient capital allocation and
missed opportunities. The "invisible" nature of data capital in traditional
financial reporting poses a significant risk for investors and a challenge for
regulatory oversight.

Optional: AI or Econometric Tools

Where applicable, the methodology may incorporate advanced analytical tools. AI and machine learning techniques can be employed for simulating data-driven growth, asset pricing, or identifying valuable data segments. This
includes predictive modeling of economic impact, clustering, feature importance
analysis, and data quality detection. Econometric tools are utilized to address
challenges such as measurement error in capital and to rigorously analyze the
contribution of data to productivity growth, ensuring statistical robustness in
the findings.
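As a concrete illustration of the feature importance analysis mentioned above, the sketch below ranks features of a synthetic dataset by how much an OLS fit's R² drops when each feature is shuffled — a minimal permutation-style proxy, with all data and parameters invented for the example rather than drawn from the report's methodology.

```python
import numpy as np

def r_squared(X, y):
    """In-sample R^2 of an OLS fit with an intercept."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return 1.0 - resid.var() / y.var()

def permutation_importance(X, y, n_shuffles=20, seed=0):
    """Average R^2 drop when each feature column is shuffled.
    A larger drop suggests the feature carries more predictive value."""
    rng = np.random.default_rng(seed)
    base = r_squared(X, y)
    drops = []
    for j in range(X.shape[1]):
        total = 0.0
        for _ in range(n_shuffles):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])  # break the feature-target link
            total += base - r_squared(Xp, y)
        drops.append(total / n_shuffles)
    return drops

# Synthetic example: feature 0 drives the outcome, feature 1 is noise
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=200)
importances = permutation_importance(X, y)
```

The same drop-in-fit logic extends to any model class; here OLS keeps the sketch dependency-free beyond NumPy.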

Analysis and Findings

This section presents the core analysis, detailing data valuation models, illustrating data's practical application through case studies, and examining its macroeconomic implications.

a) Data Valuation Models

Quantifying the economic value of data is a complex but essential undertaking. Several models have been proposed to address this challenge.

Marginal Productivity Approach: This model values data based on the additional output or revenue it generates. Analogous to how firms pay for labor
or physical capital based on their marginal product, this approach seeks to
quantify the incremental value added by an additional unit or improved quality
of data, holding other inputs constant. For instance, a bakery might calculate the
marginal product of an additional worker by observing the increase in cakes
produced. Applying this to data, one might analyze how an increase in the
volume or quality of customer data leads to a measurable increase in sales or
operational efficiency. Challenges include isolating data's specific marginal
contribution from other complementary investments, such as new software or
skilled personnel, and the non-fungible nature of data, which makes defining a
"unit" of data for marginal analysis difficult.
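To make the marginal-product logic concrete, here is a toy sketch assuming a Cobb-Douglas form Y = A·K^β·D^γ, so the data elasticity γ can be read off a log-log regression and converted into a marginal product MP_D = γ·Y/D. All variable names and the synthetic numbers are illustrative assumptions, not estimates from the report.

```python
import numpy as np

# Toy Cobb-Douglas world: output = A * capital^0.6 * data^0.25 * noise
rng = np.random.default_rng(42)
n = 500
capital = rng.uniform(1, 100, n)   # traditional capital stock
data = rng.uniform(1, 100, n)      # data stock (e.g. usable records)
output = 2.0 * capital**0.6 * data**0.25 * np.exp(rng.normal(0, 0.05, n))

# OLS in logs: log Y = a + beta*log K + gamma*log D
X = np.column_stack([np.ones(n), np.log(capital), np.log(data)])
coef, *_ = np.linalg.lstsq(X, np.log(output), rcond=None)
gamma = coef[2]  # elasticity of output with respect to data

# Marginal product of data for a firm with output Y and data stock D
Y, D = 1_000.0, 200.0
mp_data = gamma * Y / D  # extra output per extra unit of data
```

In practice the hard part, as noted above, is that "capital" and "data" are rarely this cleanly separable, and the regression would need controls for complementary investments.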

Market-Based Pricing (Market Value of Information - MVI): This approach values data by comparing it to the prices of similar data assets that



have been recently traded in a market. It operates on the economic principle of
the "law of one price," positing that identical goods in an efficient market should
have only one price. Methodologies include comparable sales analysis,
benchmarking against publicly traded data-intensive companies, or analyzing
precedent transactions.

o Advantages: This method benefits from simplicity, as it relies on observable market transactions. It reflects current market conditions and
incorporates multiple factors influencing market perception. It is also widely
accepted for valuing tangible and traditional intangible assets like patents and
copyrights.

o Disadvantages: The data industry is nascent, and there are not always sufficient comparable products or companies to serve as reliable benchmarks
for valuation. Historically, data brokers have not disclosed their prices, although
this is changing with the rise of data-as-a-service. The non-fungible nature of
data also means that similar datasets may still hold different values due to subtle
differences in information, making direct comparisons challenging.
Furthermore, this model often fails to account for the value of consumer-
generated data for which individuals are not directly compensated.
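A minimal sketch of the comparable-sales logic, assuming hypothetical recent transactions priced per record and an invented quality multiplier — the figures are illustrative, not market data:

```python
# Hypothetical recent transactions: (price_paid, number_of_records)
comparables = [
    (120_000, 1_000_000),
    (95_000, 800_000),
    (150_000, 1_200_000),
]

# Benchmark: median price per record across comparable sales
prices_per_record = sorted(p / n for p, n in comparables)
benchmark = prices_per_record[len(prices_per_record) // 2]

def market_value(n_records, quality_adjustment=1.0):
    """Comparable-sales estimate; quality_adjustment > 1 for data that is
    fresher, cleaner, or more complete than the comparables."""
    return n_records * benchmark * quality_adjustment

# Value a 500k-record dataset judged 10% better than the comparables
estimate = market_value(500_000, quality_adjustment=1.1)
```

The quality multiplier is where the non-fungibility problem resurfaces: two "similar" datasets can justify very different adjustments.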

Cost-Replacement Approach (Cost Value of Information - CVI): This model measures the value of data based on the costs associated with its
production, storage, and replacement. It is based on the principle of substitution,
suggesting that an investor would pay no more for an asset than the cost to
replace it with a new one of equal utility.

o Components: CVI includes the cost to produce and store the data, the cost to replace it if it were lost, and the impact on cash flows if the data were
lost. This involves estimating direct costs (e.g., materials, labor for data
collection/cleaning) and indirect costs (e.g., financing, legal fees, infrastructure).
For older data, depreciation and inflation must be factored in to determine
replacement cost in today's terms.

o Advantages: This approach provides an objective measure based on actual or estimated costs, making it useful for unique or specialized data assets
where market comparables are limited. It is less influenced by short-term
market fluctuations.

o Disadvantages: The CVI does not directly reflect the value derived from the use of the data or its potential to generate future revenue or insights. It only

captures the cost of creation or recreation, not the economic utility or strategic
advantage it confers.
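The CVI components above can be sketched as a single function that restates historical cost in today's money and then depreciates for age; the inflation and depreciation rates below are placeholders, not recommended parameters:

```python
def cost_value(historical_cost, age_years, inflation=0.03, depreciation=0.10):
    """Cost Value of Information sketch: inflate the original collection
    and storage cost to today's terms, then depreciate for the data's age.
    Both rates are illustrative assumptions."""
    replacement_today = historical_cost * (1 + inflation) ** age_years
    return replacement_today * (1 - depreciation) ** age_years

# A dataset that cost 100,000 to build three years ago
cvi = cost_value(100_000, age_years=3)
```

As the disadvantages note, this number says nothing about the revenue the dataset could generate in use.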



Dimensional Model: Proposed by researchers at the Harvard Data Science Review, this model expands on prior approaches by evaluating data
across multiple dimensions, including ownership, cost, utility, age, privacy, data
quality, and volume and variety. It aims to provide a more holistic view of data's
value by considering factors beyond direct financial metrics.

o Challenges: The value can vary considerably based on who uses the data and for what purpose. Subjectivity in survey-based valuation and the evolving
nature of the model pose limitations. Translating the relative value determined
by this model into monetary terms often requires a secondary application of
market-based or economic models.
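A weighted-score sketch of the dimensional idea follows. The dimension weights and scores are entirely hypothetical, not the published model's calibration, and the result is a relative score that would still need a market-based or economic model to translate into money, as noted above.

```python
# Illustrative dimensions scored 0-1, with hypothetical weights
weights = {
    "ownership": 0.15, "cost": 0.10, "utility": 0.25, "age": 0.10,
    "privacy": 0.10, "quality": 0.20, "volume_variety": 0.10,
}

def dimensional_score(scores):
    """Weighted sum over dimensions; a missing dimension scores 0."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[d] * scores.get(d, 0.0) for d in weights)

# Hypothetical assessment of a customer dataset
customer_dataset = {
    "ownership": 1.0, "cost": 0.4, "utility": 0.9, "age": 0.7,
    "privacy": 0.5, "quality": 0.8, "volume_variety": 0.6,
}
score = dimensional_score(customer_dataset)  # relative, not monetary
```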

To illustrate the growing importance of data assets, a conceptual graph comparing "Data Asset Value vs. Traditional Capital Value in selected firms"
would show a divergence over time. For leading digital firms, the proportion of
enterprise value attributable to data and other intangible assets would be
depicted as significantly increasing, often surpassing the value of their
traditional physical capital. This visual representation would underscore the
shift in corporate asset composition in the digital economy.

b) Case Studies: Data as Capital in Practice

Real-world examples demonstrate how data functions as a critical capital asset, driving innovation and competitive advantage across diverse industries.

Big Tech (Google, Amazon):

o Google: Leverages big data analytics fundamentally to enhance its search algorithms and drive product development. By continuously analyzing
user behavior, preferences, and trends, Google improves the relevance, accuracy,
and speed of search results, decoding user intent for a more tailored experience.
This data-driven approach also extends to targeted advertising, where user
search histories and behaviors are analyzed to customize advertisements,
optimizing platform efficiency and user experience. For product development,
Google uses data mining to identify new trends and user demands, guiding the
creation of innovations like Google Maps, Gmail, and Google Assistant.
Experiments with Google Search Ads, for example, have shown significant
improvements in impressions, clicks, and conversions, while decreasing cost per
conversion, all driven by data analysis.

o Amazon: Employs data science extensively for personalization, demand forecasting, and inventory management to achieve record profits. Its
recommendation engine, a core component, analyzes browsing history,
purchase patterns, and product ratings to tailor suggestions, significantly



increasing sales likelihood. Features like "Customers who bought this also
bought" seamlessly integrate suggestions, guiding customers toward
complementary products and enhancing the shopping experience. In demand
forecasting, sophisticated algorithms and machine learning models analyze
historical sales data, customer behaviors, and external factors to predict future
demand, ensuring optimal inventory levels and reducing stockouts. This
proactive approach minimizes excess inventory costs and enhances operational
efficiency, contributing to Amazon's reputation for prompt delivery.

Digital Health (NHS Data Strategies):

o Foresight AI (NHS England): This project exemplifies how de-identified NHS data is used to predict health outcomes and improve healthcare
efficiency and fairness. The Foresight AI model, trained on de-identified data
from approximately 57 million people in England, encompassing over 10 billion
healthcare events, functions like an auto-complete feature for medical timelines,
predicting future events based on past occurrences. This population-wide data
allows for the study of minority groups and rare diseases often overlooked in
smaller datasets. The collaboration aims to provide population health insights
for better planning and resource allocation, shift healthcare towards prevention
by predicting diseases early, enable precision medicine through tailored
treatments, and reduce health inequalities by ensuring inclusivity in predictive
analytics. Strict privacy measures ensure data remains within a Secure Data
Environment (SDE) and is only accessible to approved researchers.

Fintech: Data-driven innovations are significantly enhancing banking efficiency, security, and customer experience.

o AI-Powered Chatbots and Virtual Assistants: Tools like Bank of America's "Erica" handle routine inquiries, provide account information, and
offer personalized financial advice. Erica has served over 19.5 million users and
handled over 230 million client requests, saving banks an estimated $7.3 billion
in operational costs by 2023.

o Robotic Process Automation (RPA): Automates rule-based tasks such as loan applications and compliance checks. JPMorgan Chase's COIN platform
processes legal documents in seconds, a task that would take legal analysts
360,000 hours annually. RPA implementations have reported 25-50% cost
savings and 40-60% productivity improvements.

o Advanced Fraud Detection Systems: Machine learning algorithms analyze vast transaction data in real-time to identify suspicious patterns.
Mastercard's Decision Intelligence, analyzing over 1.3 billion transactions daily,



has reduced false declines by 50% and increased fraud detection by 30%. These
systems identify 95% of fraudulent transactions while keeping false positive
rates below 1%, with cybercrime costing the financial sector over $40 billion
annually.

o Predictive Analytics for Risk Management: Fintech innovators use machine learning algorithms to reduce default rates, with some models
improving risk scoring by up to 30% by incorporating data from over 2 million
transactions. Real-time data processing further strengthens these models,
enhancing decision-making efficiency by as much as 40%.
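The real-time screening idea behind these systems can be reduced to a toy rule: flag any transaction whose amount deviates from the customer's history by more than a few standard deviations. This z-score stand-in only gestures at the ML systems cited above (it is not any vendor's algorithm), and the numbers are invented.

```python
import statistics

def flag_suspicious(history, new_amount, z_threshold=3.0):
    """Flag a transaction whose amount sits more than z_threshold standard
    deviations from the customer's historical amounts. A toy stand-in for
    the production fraud-detection models described above."""
    mean = statistics.fmean(history)
    sd = statistics.pstdev(history)
    if sd == 0:
        return new_amount != mean
    return abs(new_amount - mean) / sd > z_threshold

history = [42.0, 55.0, 38.0, 61.0, 47.0, 50.0]
flag_suspicious(history, 58.0)   # typical purchase -> not flagged
flag_suspicious(history, 900.0)  # large outlier -> flagged
```

Production systems replace the z-score with learned models over many features (merchant, geography, device), but the structure — score against the customer's own baseline in real time — is the same.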

A conceptual table summarizing "Comparative case outcomes – investment efficiency, productivity gains, revenue impact" across these case studies would
highlight the quantifiable benefits.

c) Macroeconomic Implications

The widespread integration of data as capital has profound macroeconomic implications, affecting how economic activity is measured, the distribution of
wealth, and the dynamics of innovation.

GDP Measurement: Traditional Gross Domestic Product (GDP) measures, designed for an industrial economy based on tangible assets, increasingly fall
short in capturing the full scope of value created by intangible assets like data.
The "invisible" nature of data capital in traditional financial reporting
contributes to a significant measurement gap, potentially leading to an
underestimation of capital's true contribution to output and economic growth.
The 2025 System of National Accounts (SNA) is expected to expand the types of
intangible investment included in GDP to address this, but challenges remain in
valuing assets that lack market prices. Proposals for new metrics, such as a
Gross Domestic Knowledge Product (GDKP), aim to provide a more
comprehensive account of national wealth by quantifying the production and
circulation of knowledge, complementing rather than replacing GDP.
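National accountants typically build capital stocks from investment flows with the perpetual inventory method, K_t = (1 - δ)·K_{t-1} + I_t. The sketch below applies it to hypothetical data-investment (GFCF) figures; the 20% depreciation rate for data assets is an assumption for illustration, not an official SNA parameter.

```python
def perpetual_inventory(investments, depreciation=0.20, k0=0.0):
    """Perpetual inventory method: roll annual investment flows into a
    capital stock via K_t = (1 - delta) * K_{t-1} + I_t."""
    stock, k = [], k0
    for inv in investments:
        k = (1 - depreciation) * k + inv
        stock.append(k)
    return stock

# Hypothetical annual data investment (billions), growing 10% a year
flows = [10.0 * 1.1**t for t in range(5)]
stocks = perpetual_inventory(flows)
```

Capitalizing data investment this way is exactly what the anticipated SNA revision would require; the open question is how to measure the investment flows and depreciation rate for an asset with no market price.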

Inequality: The concentration of data-driven productivity gains among a small number of "Standout" firms, as observed in firm-level analysis, can
exacerbate economic inequality. The capacity to process, analyze, and monetize
data remains concentrated in Global North corporations, with lower-value work
like data labeling often outsourced to developing nations. This pattern of "data
colonialism" can deepen global inequalities, as corporate wealth consolidates in
the hands of a few, extracting data as a raw material without meaningful consent
or equitable compensation for the communities generating it. While data can
support greater efficiency and inclusion, it also carries the risk of being used for



price discrimination and feeding algorithmic biases, potentially disadvantaging
and excluding some individuals from important services.

Innovation Diffusion: Data capital is a key driver of innovation, yet its unique properties can affect how innovation diffuses across the economy. The
non-rivalrous nature of data, while enabling widespread use, can also lead to
market concentration, as dominant firms leverage their data assets to create
scale economies, similar to natural monopolies. This can stifle innovation by
reducing incentives for smaller players and leading monopolists to withhold or
diminish transformative innovations that might challenge their existing business
models. Policies focused on broad diffusion of technology might be insufficient,
necessitating approaches that address market power and promote open-source
initiatives to ensure wider access to data and algorithms.

A conceptual heat map illustrating "Global data infrastructure investment vs. digital productivity growth" by region would likely show a strong positive correlation. Regions with higher investments in digital infrastructure, such as advanced economies, would exhibit higher digital productivity growth, reflecting the foundational role of infrastructure in enabling data-driven economic activity.
This visual would highlight disparities in digital development across the globe.

A conceptual line chart depicting "National data capital stock vs. GDP per capita over time" would illustrate the growing contribution of data to national
wealth. As countries develop their digital economies, an increasing proportion of
their capital stock would be attributed to data assets, and this growth would
generally correlate with an increase in GDP per capita. However, the chart might
also reveal the challenges in accurately measuring data capital, as official
statistics are still adapting to fully capture these intangible assets.



Discussion

The pervasive role of data as capital introduces significant challenges to established economic systems and raises critical societal trade-offs.

Data Capital Challenges Traditional Accounting and Taxation Systems

Traditional accounting and taxation systems were designed for an economy where the location of labor, the ownership of capital, and the monetary value of
income were readily identifiable. In contrast, in data-rich markets, the source,
ownership, and value of data are often difficult to identify and may not always be
economically meaningful concepts. This poses substantial barriers for
accountants in effectively integrating big data, impacting financial reporting,
auditing, and cost management. The intangible nature of data assets means they
are often not reflected on financial statements, leading to a significant share of a
company's value being unquantified by traditional methods.

For taxation, the global and fluid nature of data flows complicates the application of existing tax models, which struggle to attribute value across
borders. The lack of clear, consistent data valuation standards means that tax
authorities may struggle to determine appropriate tax bases for data-driven
activities. This necessitates exploring new frameworks, such as a tax on data
collection and transmission, which would not depend on the monetary value of
data but rather on its volume or usage, potentially alleviating some of the
challenges posed by the data economy to democratic institutions. Furthermore,
rapidly changing tax requirements and the need for granular data pose
significant data collection issues for tax teams, requiring agile processes and
engagement with data owners across organizations.

Economic Trade-offs: Privacy vs. Productivity, Monopolization vs. Innovation

The data economy presents inherent economic trade-offs that policymakers must navigate.

Privacy vs. Productivity: The protection or sharing of personal data generates trade-offs with tangible economic dimensions. While extensive data
collection and sharing can fuel innovation, increase productivity, and enhance
services, for example, through personalized recommendations or predictive
healthcare, it also raises significant privacy concerns. Consumers' ability to
make informed decisions about their privacy is often hindered by imperfect
information regarding data collection, purposes, and consequences. Overly
restrictive privacy laws, while intended to protect individuals, can negatively
affect data value, discourage competition, and lead companies to hoard data,



potentially outweighing their intended benefits. Conversely, a lack of privacy
protection can lead to economic leverage by data holders, price discrimination,
and other forms of misuse, eroding trust in the digital economy. The challenge
lies in balancing these competing interests to foster trust and ensure active
participation in the digital economy.

Monopolization vs. Innovation: Data's unique economic characteristics, particularly its non-rivalry and high upfront/low marginal costs, can lead to
natural monopolies or oligopolies in data-intensive industries. When a few
companies dominate, controlling market share, data flow, and computing power,
the motivation to innovate can fade. History shows that monopolization can lead
to stagnation, as dominant firms prioritize protecting their turf over pioneering
new ideas. They may restrict the release of transformative innovations or yoke
them to existing products to preserve their market structure. These dynamics
highlight the need for stronger antitrust regulations to maintain fair
competition and ensure that new and smaller companies have opportunities to
thrive. Promoting open-source and decentralized innovation can also make data
and AI development more accessible, preventing concentration of power and
fostering broader progress.

Data Ownership, Access Inequality, and Digital Colonialism

The rapid accumulation and monetization of data have brought to the forefront issues of data ownership, access inequality, and the phenomenon of
"digital colonialism." The concept of "data colonialism" posits that tech
companies are expropriating personal and social data as a raw material, often
without meaningful consent or equitable compensation, mirroring historical
processes of colonization and primitive accumulation. This "data grab"
transforms behaviors, preferences, and emotions into economic inputs,
enclosing and commodifying the digital commons.

A stark divide exists between data production and value capture. While data is generated globally, the capacity to process, analyze, and monetize it
remains concentrated in corporations primarily in the Global North. This
exacerbates global inequalities, as these companies extract data from the Global
South to develop products that ultimately dominate local industries, with
minimal benefit flowing back to the data-providing communities. Examples
include Google's DeepMind Health partnership in India, raising questions about
data ownership and affordable access to resulting technologies.

The "data as labor" framework proposes recognizing users who create data as workers deserving compensation, which could make the economic value



users generate visible and potentially reduce overcollection if data becomes a
cost to platforms. However, this framework risks normalizing the
commodification of social life. The current model often involves individuals
receiving digital services as payment for their data, a form of "partial data
barter".

These issues underscore the need for institutional innovation to ensure more equitable and ethical data governance.

Proposed Ideas for Data Trusts, Public Data Commons, and Ethical Valuation Models

To address the challenges of data ownership, access inequality, and digital colonialism, several innovative governance and valuation models are being explored:

Data Trusts: These legal entities would hold and manage data on behalf of a group of individuals or for a specific public purpose, asserting collective
decision-making power over training data for advanced AI systems. A data trust
could scrape the internet as a digital commons and license data to commercial
model developers for a percentage cut of revenues, ensuring redistribution of
economic value and accounting for negative externalities like unemployment
from automation. This model aims to empower data subjects and ensure more
equitable benefit sharing.

Public Data Commons: This approach involves pooling and sharing data as a common resource, addressing power imbalances and promoting community
ownership and leadership for a public cause. Treating data like essential public
goods, such as clean water, could lead to dramatic improvements in public
health, productivity, and equity. Common access to public interest data,
particularly in health, mobility, and education, could foster widespread
innovation. However, significant governance challenges remain, including
managing sensitive data, determining oversight, and ensuring quality control.

Ethical Valuation Models: Beyond purely financial metrics, ethical valuation models would integrate societal values and impacts into the
assessment of data's worth. This involves considering factors like privacy,
fairness, and the potential for algorithmic bias, ensuring that data monetization
does not come at the expense of social welfare. Such models would aim to
balance innovation with public good, moving beyond a narrow focus on profit to
encompass broader societal benefits. Developing clear value propositions and
ensuring trustworthiness through transparent policies, technical safeguards,



and accountability measures are crucial for successful ethical data sharing
initiatives.

A conceptual flow diagram illustrating a "Framework for valuing data as capital across economic layers (firm, sector, nation)" would begin at the micro-level (firm), showing how data is collected, processed, and used to generate
internal value (e.g., efficiency gains, new products). This value then aggregates
to the sector level, where data-driven innovation and productivity
improvements contribute to industry growth. Finally, at the national level, the
collective impact of data capital on GDP, labor productivity, and overall
economic welfare would be depicted, while also highlighting the measurement
challenges and the need for new accounting standards to capture this value
comprehensively.

Conclusion

Data has unequivocally emerged as a new engine of economic growth, fundamentally reshaping industries and national economies. Its unique
characteristics—non-rivalry, non-fungibility, and its nature as an experience
good—distinguish it from traditional forms of capital and are central to its
immense value-generating potential. The rise of the data economy, propelled by
platforms, IoT, and AI ecosystems, has created a symbiotic relationship where
more data fuels more sophisticated AI, which in turn extracts greater value from
data, accelerating economic transformation.

However, the full realization of data's potential is hampered by significant challenges. Traditional accounting and taxation systems struggle to measure,
value, and regulate data effectively, leading to an "invisible" contribution to GDP
and an incomplete picture of national wealth. The concentration of data-driven
productivity gains among a few "Standout" firms raises concerns about
economic inequality and market monopolization, while the imperative to
balance data sharing for productivity with individual privacy rights presents
complex trade-offs. The phenomenon of "digital colonialism" highlights the
uneven distribution of value capture from data, exacerbating global disparities.

Addressing these multifaceted challenges requires bold institutional innovation in data governance and valuation. The development of data trusts
and public data commons offers promising avenues for more equitable data
ownership and access, fostering collective benefit while mitigating risks. Ethical
valuation models are essential to ensure that data monetization aligns with
broader societal welfare, moving beyond purely financial metrics.



This era demands interdisciplinary approaches, drawing expertise from economics, law, technology, and social sciences, to design robust frameworks
that unlock data's full potential while safeguarding individual rights and
promoting inclusive growth.

Continued research is vital to refine the understanding and management of data as capital:

Integrating Real-Time Data into National Statistics: There is a growing opportunity to leverage high-frequency and real-time data sources (e.g., web
scraping, mobile-phone records, banking transactions) to produce more timely,
detailed, and accurate official statistics, particularly for GDP measurement.
While traditional statistical surveys are often infrequent and prone to delays, big
data offers the potential for continuous, near-real-time insights. However,
significant challenges remain, including the inherent volatility and selectivity of
big data sources, which can lead to inexplicable jumps in time-series data. Many
big data sets also lack linking variables, limiting their integration with other
datasets or population frames, which can impede comprehensive analysis and
correction for biases. Furthermore, ensuring the quality, consistency, and
comparability of these novel data streams with established national accounts
requires significant methodological and technological investment, as well as
overcoming legal and privacy concerns. Future research should focus on
developing robust methodologies for data integration, addressing data quality
issues, and establishing clear governance frameworks for the use of big data in
official statistics.

Modeling "Data Dividends": The concept of a "data dividend" proposes a new social contract where individuals hold an equity stake in the data they create,
recognizing them as co-producers of economic value. This idea, rooted in
principles of capitalism adapted to the digital age, suggests that a portion of the
profits generated from data could be redistributed to individuals. Future
research could explore various models for implementing data dividends,
including micro-payment systems, potentially leveraging blockchain technology,
or tiered reward programs that incentivize ethical data sharing. This would
involve modeling the economic impact of such redistribution on consumer
behavior, firm investment, and overall economic inequality. Challenges include
defining the "value" of individual data for dividend purposes, establishing
transparent data use agreements, and navigating the complexities of collective
bargaining for data through "data unions". Research should also assess the
financial sustainability of such policies, considering their long-term effects on



corporate growth strategies and market dynamics. This area of inquiry holds
potential for reimagining the economic contract between individuals and
institutions, fostering more equitable value sharing in the data-driven economy.
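One of the redistribution models proposed above — a fixed share of data-derived profit split across users in proportion to their data contribution — can be sketched in a few lines. The payout share and contribution units are purely illustrative assumptions of this sketch, not parameters from any proposed policy.

```python
def data_dividends(profit, payout_share, contributions):
    """Split a fixed share of data-derived profit across users in
    proportion to their data contribution (e.g. records or usage hours).
    The payout_share social-contract parameter is purely illustrative."""
    pool = profit * payout_share
    total = sum(contributions.values())
    return {user: pool * c / total for user, c in contributions.items()}

# Hypothetical platform: 5% of a 1M profit is returned as dividends
payouts = data_dividends(
    profit=1_000_000, payout_share=0.05,
    contributions={"alice": 120, "bob": 60, "carol": 20},
)
```

Even this toy version exposes the research questions flagged above: what counts as a "unit" of contribution, and whether proportional splits are equitable when data value is highly non-linear in volume.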

References:

1. Acemoglu, D., & Restrepo, P. (2019). Artificial Intelligence, Automation, and Work. NBER Working Paper No. 24196.
2. Brynjolfsson, E., Rock, D., & Syverson, C. (2017). Artificial Intelligence and the Modern Productivity Paradox: A Clash of Expectations and Statistics. NBER Working Paper No. 24001.
3. Coyle, D., & Manzia, P. (2020). Modernizing Economic Statistics: Data as a Strategic Asset. Bennett Institute for Public Policy.
4. Organization for Economic Co-operation and Development (OECD). (2019). Going Digital: Shaping Policies, Improving Lives.
5. Organization for Economic Co-operation and Development (OECD). (2021). Enhancing Access to and Sharing of Data: Reconciling Risks and Benefits for Data Re-use across Societies.
6. Wang, Y., Kung, L., & Byrd, T. A. (2018). Big data analytics: Understanding its capabilities and potential benefits for healthcare organizations. Technological Forecasting and Social Change, 126, 3–13.
7. Google Cloud. (2023). Data Asset Value Framework.
