26/04/2024

About Ethical Algorithmic Implementations

In our increasingly digital world, algorithms are shaping our lives in ways that are both visible and invisible. They influence the news we consume, the products we purchase, and even the people we interact with. While algorithms can be incredibly beneficial, they also raise important ethical questions.

I define ethics as the study of how to organize the world so that it is most harmonious, and how to ensure its correct order and well-being in all aspects of life. As new technologies keep emerging and evolving, it is essential that we fully understand the advantages and disadvantages of the technology we use. Trending technologies like Artificial Intelligence (AI) and Machine Learning (ML) are embedded in our devices; they learn and adapt to how we live, feel and think, and, as stated earlier, they are capable of influencing or even making decisions on our behalf: shaping how we feel, what we consume or purchase, and even the people and organizations we interact with. At their core, these algorithms are sets of instructions used to solve problems or perform certain tasks, choosing among alternatives to produce a desired outcome.
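To make that definition concrete, here is a deliberately tiny, hypothetical sketch in Python of such a set of instructions: a made-up content-ranking rule of the kind that decides which news items appear first in a feed. The article data, tags and scoring rule are all invented for illustration, not taken from any real system.

```python
def rank_articles(articles, interests):
    """Order articles by a simple relevance score: one point per matching tag."""
    def score(article):
        return sum(1 for tag in article["tags"] if tag in interests)
    return sorted(articles, key=score, reverse=True)

articles = [
    {"title": "New phone released", "tags": ["tech", "gadgets"]},
    {"title": "Local election results", "tags": ["politics"]},
    {"title": "AI beats chess champion", "tags": ["tech", "ai"]},
]

# The "desired outcome" here is a feed tailored to the reader's interests.
for article in rank_articles(articles, interests={"tech", "ai"}):
    print(article["title"])
```

Even a rule this small is already making choices on the reader's behalf about what they see first, which is exactly where the ethical questions begin.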

Algorithms are applied everywhere and are embedded in every digital device. These technologies and algorithms are used by two kinds of people: those who understand how they work and those who do not. Technology is a gift and a powerful tool that, if not used correctly, can lead to ethical issues. It is for this reason that I decided to write this article about “Ethical Algorithmic Implementations”. I define Ethical Algorithmic Implementations as the approach software/system developers and designers take when implementing these algorithms to ensure that they do not break any legal or moral rules.

These algorithms have a huge impact on privacy, accuracy, property, accessibility and quality of life, hence the emphasis on system/software engineers and designers prioritizing and addressing the antisocial behaviour of these algorithms. Algorithms can be biased and unfair in their decision-making, leading to privacy and security issues. Even though these algorithms are designed for specific purposes by their designers and engineers, they can end up working against the very moral or ethical purpose they were designed for.
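To illustrate how this can happen, here is a small, hypothetical Python sketch; the groups, scores and approval rule are all invented. A single “objective” approval threshold is taken from historical records, but because those records are skewed against one group, the resulting decisions quietly reproduce that skew.

```python
import random

random.seed(42)

# Two hypothetical demographic groups. Group B's recorded scores are lower
# only because of how the historical data was collected (a made-up skew for
# illustration), not because of anything about the applicants themselves.
def make_applicants(group, n, offset):
    return [{"group": group, "score": random.gauss(60 + offset, 10)} for _ in range(n)]

applicants = make_applicants("A", 500, offset=5) + make_applicants("B", 500, offset=-5)

# A single "neutral" cut-off taken from the pooled historical data.
threshold = sorted(a["score"] for a in applicants)[len(applicants) // 2]

# The rule looks objective, yet it quietly reproduces the skew in the data.
for group in ("A", "B"):
    members = [a for a in applicants if a["group"] == group]
    approved = sum(a["score"] >= threshold for a in members)
    print(f"Group {group}: {approved / len(members):.0%} approved")
```

Nothing in the rule mentions group membership, and yet one group is approved far more often than the other. That is the quiet way bias enters otherwise well-intentioned designs.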

Consider Coded Bias, a documentary that premiered on 22 March 2021. It examines how, in an increasingly data-driven, automated world, the question of how to protect individual civil liberties in the face of artificial intelligence looms larger by the day. Coded Bias follows M.I.T. Media Lab computer scientist Joy Buolamwini, along with data scientists, mathematicians and watchdog groups from all over the world, as they fight to expose the discrimination within algorithms now prevalent across all spheres of daily life.

While writing this article, I came across the term persuasive technology: technology designed to influence and change users’ attitudes and behaviours through persuasion and social influence. Persuasive technology works much like these biased algorithms; if measures are not put in place to curb the issue, privacy problems will keep rising. If these algorithms become biased, they have the power to treat people differently, denying or granting them access based on their demographics.
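One concrete measure engineers and designers can put in place is a simple pre-deployment audit. The sketch below is a hypothetical Python example of such a check (the decision log and the 10% tolerance are invented for illustration): it compares how often access is granted to each demographic group and flags the algorithm for review when the gap is too large.

```python
def demographic_parity_gap(decisions):
    """decisions: list of (group, access_granted) pairs; return the largest
    difference in access rates between any two groups, plus the rates."""
    rates = {}
    for group in {g for g, _ in decisions}:
        outcomes = [granted for g, granted in decisions if g == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return max(rates.values()) - min(rates.values()), rates

# Invented decision log: group B is granted access noticeably less often.
decisions = ([("A", True)] * 80 + [("A", False)] * 20 +
             [("B", True)] * 55 + [("B", False)] * 45)

gap, rates = demographic_parity_gap(decisions)
print(f"Access rates: {rates}, gap: {gap:.2f}")
if gap > 0.10:  # tolerance chosen purely for illustration
    print("Flag for review: access is granted unevenly across groups.")
```

A check like this does not fix bias on its own, but it forces the question to be asked before an algorithm is allowed to decide who gets in and who is turned away.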

But with the power of digital literacy, we can all work together to fight persuasive technology and biased algorithms by providing individuals with resources that help them understand how the internet and these technologies work, so that they can enjoy technological freedom as they use them. Let us all be mindful of how we use the internet and these technologies, so that we can have more technological freedom.
