Balancing Security and Access for Increased Algorithmic Integrity
When we talk about security in algorithmic systems, it's easy to focus solely on keeping the bad guys out.
But there's another side to this coin that's just as important: making sure the right people can get in.
This article aims to explain how security and access work together for better algorithm integrity.
Why Does This Balance Matter?
Let’s break it down.
Keeping Bad Actors Out
It's obvious why we need to prevent unauthorized access.
Bad actors could:
- Steal sensitive data
- Manipulate algorithms for their own benefit
- Cause system failures or disruptions
These can lead to financial losses, reputational damage, and even legal consequences.
Robust security measures are a must.
Ensuring Necessary Access
Here's where it gets tricky.
While we're busy building digital fortresses, we need to make sure we're not locking out the good guys.
Prioritize access over security, and you're leaving the door open for potential breaches and misuse.
Lean too far the other way, and deny people the access they need, and you can't effectively ensure integrity.
This creates a paradox: overzealous security measures can actually create or increase risk.
Here's why, with reference to the 10 key aspects of algorithm integrity from a previous article:
| Key aspect of algorithm integrity | Risk of not providing access | Ref. |
| --- | --- | --- |
| 1. Accuracy and robustness | Limited Oversight | A |
| 2. Alignment with objectives | Impaired Decision Making | B |
| 3. Fairness (incl. impact assessments) | Limited Oversight | A |
| 4. Transparency and explainability | Reduced Transparency | D |
| 5. Security | Workarounds and Shadow IT | C |
| 6. Privacy | Workarounds and Shadow IT | C |
| 7. Governance, Accountability and Auditability | Reduced Transparency | D |
| 8. Risk Management | Missed Early Warnings | E |
| 9. Ethics and Training | Limited Oversight | A |
| 10. Compliance | Incomplete Audits | F |
- Limited Oversight (A): teams responsible for governing, monitoring and controlling algorithmic systems need access to perform their duties effectively. Without it, they're flying blind. (One way to provide this kind of read-only access is sketched after this list.)
- Impaired Decision Making (B): leaders and managers need access to make informed decisions about system development and deployment. Limited access can lead to decisions based on incomplete information.
- Workarounds and Shadow IT (C): when users can't access the data they need, they might use unsanctioned tools or copy data to other environments, inadvertently creating new security risks.
- Reduced Transparency (D): overly restrictive access can make it difficult to efficiently provide transparency to executives, regulators or other stakeholders.
- Missed Early Warnings (E): small irregularities can be early indicators of larger problems. Without access to the relevant data and logs, these subtle signs might go unnoticed until they escalate.
- Incomplete Audits (F): auditors need full visibility into systems, data and change history; restricted access leads to incomplete or qualified audit conclusions.
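To make the "access for oversight" idea concrete, here is a minimal sketch of a role-based access model: oversight and audit roles get read-only visibility, while only developers can change the system. The role names and permissions are hypothetical examples used for illustration, not a prescribed design.

```python
# Minimal sketch (illustrative only), assuming a simple role-based access model
# where oversight roles get read-only visibility without write permissions.
# Role names and permission mappings below are hypothetical examples.

# "read" covers viewing data, model outputs, configurations and logs;
# "write" covers changing models, code or configuration.
ROLE_PERMISSIONS = {
    "model_developer": {"read", "write"},  # builds and maintains the algorithm
    "risk_oversight": {"read"},            # monitors accuracy, fairness, drift
    "internal_audit": {"read"},            # needs full visibility, makes no changes
    "business_user": set(),                # no direct access to system internals
}


def can_access(role: str, action: str) -> bool:
    """Return True if the given role is permitted to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())


# Oversight and audit can see what they need, but cannot alter the system.
assert can_access("internal_audit", "read")
assert not can_access("internal_audit", "write")
assert can_access("model_developer", "write")
```

In practice this would typically live in an identity and access management platform rather than application code, but the principle is the same: visibility for oversight does not require the ability to change the system.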
Security and access are complementary
Robust security is crucial, but it must be balanced with the need for oversight and control.
The goal should be to create a secure algorithmic system that still allows for the necessary visibility and access to maintain integrity.
Ensuring that the right people have the right access reduces risk. We want security measures that don't hinder legitimate work, and access that doesn't compromise security.
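One way to picture this pairing: access is granted to those who need it, and the access itself is logged so it can be monitored and audited. The sketch below illustrates that idea; the dataset name, roles and log format are hypothetical examples, not a specific recommendation.

```python
# Minimal sketch (illustrative only): pairing access with an audit trail, so
# authorized users get the data they need and every access attempt is recorded
# for later review. Dataset, roles and log format are hypothetical examples.

import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(message)s")
access_log = logging.getLogger("access_audit")

# Stand-in for a governed data store.
DATASETS = {"loan_decisions_2024": [{"id": 1, "outcome": "approved"}]}

READ_ROLES = {"model_developer", "risk_oversight", "internal_audit"}


def read_dataset(user: str, role: str, dataset_name: str):
    """Return a dataset for an authorized reader, logging every attempt."""
    timestamp = datetime.now(timezone.utc).isoformat()
    if role not in READ_ROLES:
        access_log.info("%s DENIED read of %s to %s (%s)", timestamp, dataset_name, user, role)
        raise PermissionError(f"Role '{role}' is not authorized to read {dataset_name}")
    access_log.info("%s READ %s by %s (%s)", timestamp, dataset_name, user, role)
    return DATASETS[dataset_name]


# An auditor can retrieve what they need; the access itself is visible and reviewable.
records = read_dataset("a.reviewer", "internal_audit", "loan_decisions_2024")
```

The point is that logging legitimate access is itself a security control: people get what they need, and the organization retains a record of who saw what, and when.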
By getting it right, we enhance algorithmic integrity.
Disclaimer: The information in this article does not constitute legal advice. It may not be relevant to your circumstances. It may not be appropriate for high-risk use cases (e.g., as outlined in The Artificial Intelligence Act - Regulation (EU) 2024/1689, a.k.a. the EU AI Act). It was written for consideration in certain algorithmic contexts within banks and insurance companies, may not apply to other contexts, and may not be relevant to other types of organizations.