EMFSA added notes as per the video:
Human values should be central in the development of AI.
Lesson from the Frankenstein story: you should not leave your technologies to their own devices. You should take responsibility not only for making the technology but also for introducing it into society. With AI, we develop technologies that do things beyond our sphere of influence. We cannot simply tell them what to do – they learn, they develop patterns of interaction that cannot be reduced to what we put into them. The only way to deal with these systems is to educate them and to hold ourselves accountable for what they do. If we do not, we are in the same situation as with the Frankenstein monster – leaving them to themselves. We made a beautiful thing, but then we become afraid that it might take over the world. By then it is too late – we should act now. We should learn to educate the algorithms before they take over. In many discussions we still separate ethics and humans from technologies and the labs where we make them. Ethics and technology are often kept very far apart.