Feb 2 / Naz

How much can we trust "Black Box" AI solutions?

I'm a big fan of technology, but when it comes to personal or confidential information, I'm wary of using AI notetaking and similar tools. The privacy concerns, both legal and ethical, are significant when using AI solutions for personal tasks.
In our daily interactions, whether we're talking, writing, drawing, taking photos, making videos, or just moving, AI solutions can record our actions depending on how they're used.
The problem is that we, as everyday users, often aren't aware of the implications. We don't know how the data collected by AI apps is used by their providers, nor can we be certain that these tools meet ethical and quality standards.
For instance, imagine if the health data we collect through our smartphone apps were shared with health insurance companies or employers, or if recordings of our conversations with digital assistants were used to create inappropriate content with our voices or deepfake images.
When buying a physical device, we thoroughly check its safety and quality. Yet we don't apply the same diligence to AI solutions, which are arguably more critical. With a device, we can personally verify its safety and check whether it was stolen or illegally acquired.
With AI, the situation is quite different. We often have no insight into what's inside the "black box": how the AI was developed, whether the data used to train it was obtained ethically, or what exactly happens to our data once we input it. All we see is the output, and we can never be completely certain of its accuracy or reliability.
In summary, while we can assess and understand the risks of physical objects like cars, AI solutions present a more complex challenge because of their opaque nature and the uncertainty about how our data is processed and used.
Let's stay conscious of the risks of "Black Box" AI solutions, and where possible, choose tools that are open source and comply with ethical and quality standards.