AI and hardware production: Two forgotten aspects?

Dear all,

I am not sure whether I simply missed something at the beginning of the Digital Principles discussions. It may therefore be that the following is off-topic, has already been discussed, or is no longer relevant. If that is the case, please excuse me!

I wonder whether the two questions below should receive some attention within the principles. My motivation is the holistic view we need to take of every single project in order to meet the SDGs.

  1. How can we “do no harm” when buying hardware/electronics (of any kind)?
  2. How can we prevent AI from reinforcing/empowering discrimination? (An introduction can be found here; the comic essay can be downloaded here.)

I’ve read “By using the Digital Principles, organizations can develop more effective and sustainable solutions and move one step closer to achieving the SDGs” here.

I agree, and I want to connect my two questions above to both the SDGs and the Digital Principles.
“Build for Sustainability” should involve all three dimensions of sustainability: social, environmental, and economic. It cannot simply mean that you “want your project to last years and years after it starts”.
We have to care, don’t we?

Ad 1.
What do we know about the impact of hardware production on social, environmental, and economic capital? With our procurement of equipment, do we support decent work (SDG 8), reduce the consumption of resources, utilities, and energy (SDG 8, 9, 11, 12), and avoid waste, waste water, and pollutant emissions (SDG 3, 6, 11, 12)? Does our purchasing decision help end poverty (SDG 1) or reach gender equality (SDG 5)? Do we know the supply chain of the hardware products needed for our software solution? Does it help to reduce all forms of violence (SDG 16), or are raw materials from conflict regions (conflict minerals) used? Do producers at all levels respect safety and good working conditions (ILO labour standards)? Or do we directly support child labour and forced labour when purchasing hardware? Can we utilize the power of the consumer?
(More information can be found here: ILO’s Sectoral Studies on Decent Work in Global Supply Chains; Child labour.)
“Design with the user” aims at “using fewer resources”. Can the solution be adapted to hardware that is already there? Does “Reuse and Improve” also aim at reusing and continuing to use existing hardware?

Ad 2.
Are people like you and me empowered to discuss and decide on the heuristics, algorithms, ground truths, and learning data (input data and desired output values) used in machine learning and deep learning settings? In other words: are we able to control AI and make sure it does not multiply discrimination? This is not a matter of professional knowledge; it can be done by “you and me”, ordinary people. But who takes care of it? Who translates “algorithms and data” into natural language and vice versa?
“Design with the user” calls on us to “Develop context-appropriate solutions informed by users’ priorities and needs”. Not being discriminated against is a general need, I guess. The same principle encourages us to “Embrace an iterative process that allows for incorporating feedback and adapting your solution”, which is especially important when data are used to train AI.
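To make the point above concrete, here is a deliberately tiny, hypothetical sketch (all names and numbers invented): a naive model that only “learns” the majority outcome per group will faithfully reproduce whatever skew is in its training data. This is the mechanism by which historical discrimination can be multiplied rather than corrected.

```python
from collections import Counter

# Hypothetical historical records: (group, hired?) pairs.
# Group "A" was systematically favoured in the past.
history = [("A", True)] * 80 + [("A", False)] * 20 + \
          [("B", True)] * 30 + [("B", False)] * 70

def train_majority_model(records):
    """'Learn' the majority outcome for each group -- the simplest possible model."""
    learned = {}
    for group in {g for g, _ in records}:
        counts = Counter(outcome for g, outcome in records if g == group)
        learned[group] = counts.most_common(1)[0][0]
    return learned

model = train_majority_model(history)
print(model)  # the learned rule mirrors the historical skew: {'A': True, 'B': False}
```

Real systems are far more complex, but the principle is the same: if nobody inspects the training data and the demanded outputs, the bias simply passes through.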

I have no concrete idea where my questions should lead, but I feel these two are important enough to be discussed.

And these two related posts come to mind:
Using less resources: OT? Reuse of older hardware
Ethics: Ethics Consideration for Digital Development: How can digital professionals uphold ethical principles in their work?

Take care!


I think you are absolutely right to highlight these concerns. There are a few individual projects that are trying to tackle and discuss some of the issues in this space (e.g. the network, the Basel Action Network), but it is far from being embedded in the culture of development practice or in government priorities. We need to keep raising these questions and holding each other to account.

Energy consumption is another aspect to consider:
“The energy consumption of Information and Communication Technologies (ICT) is increasing by 9% every year. It is possible to limit this growth to 1.5% per year by moving to sober digital practices.”