Quiz answers

QUIZ 1

  1. All radical opinions are disinformation. NO – There are several types of radical opinions that are not disinformation, see: SAUFEX blog post 8
  2. I am able to see the world without bias. NO – Thinking that you can see the world without bias is called “naive realism”, and is itself a bias
  3. People always act on their strong opinions. NO – According to Dan Sperber some beliefs, even strong ones, are “reflective”: beliefs with little or no consequences for a person’s behavior, see: SAUFEX blog post 8
  4. People love to be right. YES – This is the famous confirmation bias
  5. Radical people are consistent. NO – See: Festinger’s theory on cognitive dissonance, as well as SAUFEX blog post 10
  6. Opinions based on facts are more valuable than opinions based on beliefs. NO – Belief-speaking and fact-speaking are both needed in a productive democratic debate according to Lewandowsky, see for instance: SAUFEX blog post 19
  7. Misinformation is free speech. YES – See: J.S. Mill’s On liberty. ‘Misinformation’ is not mentioned in the DSA, while ‘disinformation’ is mentioned 13 times, often in the context of it, or its dissemination, constituting a societal risk.
  8. For people, belonging is more important than the truth. YES – See: Tajfel and Turner’s Social Identity Theory
  9. We are all hypocrites. YES – See: R. Kurzban’s Why everyone (else) is a hypocrite
  10. It is normal to dislike people who are different. NO – See: Van Bavel & Packer’s The power of us

QUIZ 2

  1. We should call disinformation campaigns ‘FIMI’ only after we have established who is responsible. YES – Since it is hard to establish who is responsible for disinformation campaigns, it is often unclear whether those responsible are foreign, and it would be premature to call such campaigns ‘FIMI’, for this would unjustly suggest knowledge of who is responsible; on the difficulties of attribution see, for instance, the 2020 publication The Global Disinformation Order: “While many countries have seen an increase in computational propaganda on social media, attribution back to a particular actor remains difficult.”
  2. It is easy to establish intent behind ‘disinformation’ campaigns. NO – In order to establish intent, one first needs to be able to attribute the campaign to an actor. Since attribution is hard (see above), establishing intent is not easy.
  3. FIMI campaigns are sometimes orchestrated by non-state actors. YES – The EEAS defines FIMI as “a pattern of behaviour that threatens or has the potential to negatively impact values, procedures and political processes. Such activity is manipulative in character, conducted in an intentional and coordinated manner. Actors of such activity can be state or non-state actors, including their proxies inside and outside of their own territory”.
  4. The DSA clearly states what illegal and “awful but lawful” content is. NO – The European Commission explains: “Does the Digital Services Act define what is illegal online? No. /…/ What constitutes illegal content is defined in other laws either at EU level or at national level /…/ Where a content is illegal only in a given Member State, as a general rule it should only be removed in the territory where it is illegal.” ‘Awful but lawful’ is not mentioned in the DSA.
  5. Fact-checking is the most effective intervention against FIMI. NO – The 2023 publication Using Psychological Science to Understand and Fight Health Misinformation, drafted by many of the best-known experts in the field, states: “Fact-checking interventions, which are designed to detect and tag falsehoods, tend to be highly effective outside of political domains but not very effective within them (e.g., Walter et al., 2020), because people share misinformation for reasons beyond whether they believe it to be true.” The publication lists fact-checking’s strengths and limitations, along with recommendations on how to increase its effectiveness; it does not single out any one intervention type as the most effective.
  6. It is acceptable to use AI to generate policy-making recommendations. NO – Not when the recommendations concern us humans, see: SAUFEX blog post 35
  7. There is always a less radical alternative for denial of services interventions. YES – This is one of the premises of the SAUFEX project, see: SAUFEX blog post 1
  8. Interventions against FIMI should primarily defend democratic processes and institutions. NO – It is tempting for states to defend their institutions and procedures first and foremost, but another of the SAUFEX project’s premises is that interventions against FIMI should primarily defend citizens; see the Epilogue of R. Kupiecki and T. Chłoń’s publication Towards FIMI Resilience Council in Poland (to be published on the SAUFEX website soon) and SAUFEX blog post 4
  9. Hybrid threats are not necessarily accompanied by disinformation/FIMI campaigns. YES – Hybrid threats are not always accompanied by an information campaign; in fact, incidents with natural causes are sometimes wrongly attributed to malign foreign activities (see for instance: Hybrid threats: Russia’s shadow war escalates across Europe)
  10. Participatory institutions like the Resilience Council should base their recommendations on facts only. NO – There is a philosophical gap between descriptions (what “is”) and prescriptions (what “ought”): what “is” does not suffice to conclude what “ought” to be done – see: SAUFEX blog post 19