How to Create an Effective Code of Data Ethics

Digital businesses are accustomed to thinking about, and dealing with, digital risks that come from outside their organizations. Cybersecurity threats from hackers are the most typical example. Increasingly, however, digital companies are being called on to confront another kind of risk, one that cybersecurity frameworks alone can’t address.

These are the risks that may arise from a company’s failure to adhere to ethical data practices. As digital trust becomes an ever more important issue for consumers, companies must strengthen their commitment to handling the massive amounts of consumer data they collect responsibly and ethically.

Such responsibility goes beyond simply ensuring that data privacy is protected. Instead, it encompasses all kinds of issues relating to analyzing data and acting on the insights it provides. This includes using data for any purpose that runs contrary to the disclosers’ consent, or using data insights to amplify harmful or prejudicial biases.

Given the complexity of these issues, it’s not surprising that many companies don’t know where to begin when it comes to creating policy frameworks for ethical data use. Pre-existing codes of conduct are often hopelessly outdated and provide no useful guidance on how to handle the central role that consumer data has come to occupy in the digital economy.

To help address this gap, the strategic consulting firm Accenture has developed a set of 12 principles that digital businesses can use as guidelines for creating an effective, company-specific code of data ethics. These data-centric principles include:

  1. Respecting the people behind the data is the highest priority.

Insights derived from data can be compelling, but can also cause harm to individuals and communities. Companies must always be aware of and take into account the real-life, human side of big data.

  2. What is done with datasets is even more important than what data is collected and how.

Companies and data professionals must ensure that regardless of data status – public, private, or proprietary, for example – the downstream use of datasets is consistent with the disclosers’ intentions and understanding.

  3. There is no such thing as raw data.

Companies must understand that the data and analytical tools we use today are shaped by how they have been used in the past. They also arise from a complex history of human decision-making. To help guide the development of these tools in the future, this history should be auditable.

That means that companies should prioritize the creation of mechanisms that track things like the context in which data is collected and the methods by which consent is given. All of these factors will impact the data itself.
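
To make this concrete, below is a minimal sketch in Python of what such a tracking mechanism might look like. The record structure, field names, and example values are hypothetical illustrations, not part of Accenture’s principles:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical provenance record: one per dataset, capturing the
# collection context and consent terms so they can be audited later.
@dataclass
class ProvenanceRecord:
    dataset_id: str
    collected_at: datetime    # when the data was gathered
    collection_context: str   # e.g. "checkout form", "mobile app telemetry"
    consent_method: str       # e.g. "opt-in checkbox", "signed agreement"
    permitted_uses: list[str] = field(default_factory=list)  # uses the discloser agreed to

    def permits(self, proposed_use: str) -> bool:
        """Check a proposed downstream use against the discloser's consent."""
        return proposed_use in self.permitted_uses

# Record the context at collection time; check it before any downstream use.
record = ProvenanceRecord(
    dataset_id="orders-2024-q1",
    collected_at=datetime.now(timezone.utc),
    collection_context="checkout form",
    consent_method="opt-in checkbox",
    permitted_uses=["order fulfillment", "fraud detection"],
)
assert record.permits("fraud detection")
assert not record.permits("targeted advertising")
```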

  4. Safeguards should be matched with expectations when it comes to privacy and security.

When parties disclose data, their expectations about how privacy and security will be handled are usually very context-specific. Companies should take these expectations into account and ensure that the provided safeguards meet or exceed them.

  5. The law must be followed, but it is often the minimum standard.

Due to the pace of digital development, by the time technologies are implemented, the regulations pertaining to them are often already outdated. Digital leaders should therefore recognize that complying with existing legislation is the bare minimum standard, rather than an acceptable one, and be prepared to anticipate and outpace legal requirements.

  6. Always have a reason for the data you collect.

Gathering data simply for the sake of having more data exacerbates the risk that it will be used in the future for unpredictable purposes. Businesses should seriously consider whether collecting less data might enable more accurate analysis while minimizing risk.
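
One way to operationalize this idea, sketched below in Python, is to require that every field a form collects has a declared purpose, and to drop anything that doesn’t. The schema and field names here are illustrative assumptions, not a prescribed implementation:

```python
# Hypothetical collection schema: a field is kept only if it has a
# declared purpose, enforcing "collect less, and with a reason" by default.
COLLECTION_SCHEMA = {
    "email": "order confirmation",
    "postal_code": "shipping estimate",
    # "birth_date" is deliberately absent: no declared purpose, so it is never stored.
}

def minimize(raw_submission: dict) -> dict:
    """Keep only the fields that have a declared purpose in the schema."""
    return {k: v for k, v in raw_submission.items() if k in COLLECTION_SCHEMA}

submission = {"email": "a@example.com", "postal_code": "90210", "birth_date": "1990-01-01"}
print(minimize(submission))  # {'email': 'a@example.com', 'postal_code': '90210'}
```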

  7. Recognize that data can be used to include – and to exclude.

Data use is not “one size fits all.” It’s important for businesses to recognize that different people, groups, and communities will be impacted differently by data collection, correlation, and prediction. Taking this into consideration will help create a level playing field for digital culture in which everyone benefits equally from data’s social and economic advantages.

  8. Data disclosers should understand analysis and marketing methods.

As much as possible, companies should provide data disclosers with the information necessary for them to understand what will happen to their data. Being as transparent as possible at the data collection stage helps reduce ethical risks later on.

  9. Accurate qualifications, professional standards, and peer accountability are vital.

In order for data science to succeed as a discipline in the long term, public and client trust is vital. This means that all practitioners should strive to develop and adhere to best practices to demonstrate their accountability and commitment to shared, high standards.

  10. Transparency, configurability, accountability, and auditability are key elements of good design.

Design practices are a valuable tool in helping overcome critical barriers to shared, robust ethical standards. Incorporating the qualities above into data engineering can be an important contribution to solving ethical dilemmas that may arise down the road.
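
By way of illustration, a data engineering team might build auditability in by logging who ran which transformation on which dataset. The following minimal Python sketch assumes nothing beyond the standard library; the wrapper and its parameters are hypothetical:

```python
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("audit")

# Hypothetical audit wrapper: every transformation emits a structured
# record of who did what, to which dataset, and when.
def audited(dataset_id: str, operator: str, step_name: str, step, rows: list) -> list:
    result = step(rows)
    log.info(json.dumps({
        "dataset": dataset_id,
        "operator": operator,
        "step": step_name,
        "rows_in": len(rows),
        "rows_out": len(result),
        "at": datetime.now(timezone.utc).isoformat(),
    }))
    return result

# A filtering step now leaves an auditable trace instead of changing data silently.
rows = [{"age": 17}, {"age": 34}, {"age": 52}]
adults = audited("users-v2", "analyst@example.com", "drop_minors",
                 lambda rs: [r for r in rs if r["age"] >= 18], rows)
```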

  11. Internal ethical reviews are a must for products and research practices.

New products, services, and research programs should be subject to consistent, well-organized, and legally accountable ethics review processes. Internal processes are essential, and peer review or external review can significantly strengthen public trust.

  12. Governance practices are a priority.

Ethical governance depends on shared, predictable, and clear practices. Digital companies should ensure that all team members know and understand these practices, and that they are regularly reviewed for robustness and accountability.
