Roundtable discussion on the impact of digital media on misinformation & content moderation; Determination of ICASA councillors’ salaries and allowances


Communications and Digital Technologies

25 May 2021
Chairperson: Mr B Maneli (ANC)

Meeting Summary


The Committee received a briefing in response to a letter from President Ramaphosa requesting the Assembly to consider the draft notice of his determination on the salaries and allowances of councillors of the Independent Communications Authority of South Africa (ICASA). It also conducted a roundtable discussion in a virtual meeting on the impact of digital media on misinformation and content moderation.

The Committee was told that the ICASA chairperson and councillors earned more than R1.5 million per annum, and that the President had requested a salary freeze for the 2020/21 financial year. It was pointed out that the budget for ICASA had been reduced due to the National Treasury’s reprioritisation of funds, as well as the economic challenges facing the country.

Members said there were “solid and reasonable” grounds for the recommendation, and while acknowledging that the task of the councillors was difficult, felt that the Committee should support the President. The Democratic Alliance reserved its right to vote on the matter pending caucus approval.

Pygma Consulting presented on the role of social media in civil society, and Google SA presented on misinformation and content moderation. The Committee heard that Google had invested resources into the continent to deal with misinformation and content moderation, and also worked with political parties and users in order to provide access to authoritative information, support campaigns and monitor misinformation. It had partnered with the Independent Electoral Commission (IEC), the Code for Africa and Real 411, a platform for reporting digital disinformation.

The roundtable discussion focused on several key areas. These included the employment of local engineers and content moderators; measures to ensure privacy; Google SA’s compliance with key legislation, such as the Protection of Personal Information Act and local tax regulations; control of advertising content; child safety online; digital skills training; and fake news regarding the 2021 and 2024 South African elections, Covid-19 and vaccinations.

Meeting report

The Chairperson greeted the Members, presenters and South Africans who watched on various platforms. He dealt with the formalities of virtual meeting rules.

Apologies were given for the Minister of Communications, Ms Stella Ndabeni-Abrahams, and the Deputy Minister, Ms Pinky Kekana, who was at a Home Affairs Portfolio Committee meeting. Apologies were given for Ms Nonkqubela Jordan-Dyani, Acting Director-General for the Department of Communications and Digital Technologies (DCDT), who had a family emergency, as well as for Mr N Kwankwa (UDM).

For the discussion on Google, Members of the Select Committee on Public Enterprises and Communications were excused, as they were in a National Council of Provinces (NCOP) meeting. In addition, the representative of TikTok and the Chairperson of the Universal Service and Access Agency of South Africa (USAASA) could not attend.

He recognised that it was Africa Day, and mentioned the importance of commitments made to freedom, peace and development in Africa in order for it to play a role in global politics. He raised the point that support must be given to Palestine, and advocated for communication and connectivity.

The Chairperson referred to a letter received from Ms Phumzile van Damme, expressing her gratitude to the Committee, as she had resigned.

Determination of salaries and allowances of ICASA councillors

Mr Mbombo Maleka, Committee Content Advisor, gave the presentation on the determination of the salaries and allowances of the councillors of the Independent Communications Authority of South Africa (ICASA), as requested by the President in a letter dated 26 March 2020.

There had been a briefing of the Committee on 8 July 2020 by the Department. He said that the current remuneration of ICASA councillors had been determined by the Minister of Communications in 2008. Since then, the budget for ICASA had been reduced due to National Treasury’s reprioritisation of funds, and the country had been faced with serious economic challenges.

The ICASA chairperson and councillors earned above R1.5 million per annum, and the President had determined a salary freeze (0%) for all public office bearers for the 2020/21 financial year.

(See presentation for information regarding the legislation)

Discussion

Mr Z Mbhele (DA) said that there were solid and reasonable grounds for the recommendation received from the Presidency. However, he asked for the minutes of the meeting to reflect that the Democratic Alliance (DA) reserved their right to vote on this matter, as caucus approval was needed.

Ms P Faku (ANC) hoped that the office bearers and councillors understood the position of government regarding the effects of Covid-19 and the recession facing the country. The Committee understood that the task of the councillors was difficult, but it had to support the proposal from the Presidency.

Ms N Khubeka (ANC) said that the Committee was “fully aware” of the letter. The Committee would want to give support, but the issue of economic recovery and the South African economy was still a challenge. (She lost connection and was inaudible).

The Chairperson highlighted Mr Mbhele’s point, and noted that this was not an objection to the item, but rather a record to show that the DA had reserved its right to vote on the matter once it reached the House. The matter was agreed to.

He asked to move to the next item, and Ms Faku agreed.

Chairperson's opening remarks

The Chairperson said that there had been last-minute withdrawals by Facebook and other entities from the roundtable discussion. However, the Committee would hope to engage with these entities in the future. He thanked Google SA for honouring its commitment, and said that it was a discussion rather than an enquiry. He hoped that Google SA would highlight what it was doing about misinformation, particularly leading up to the elections in 2021 and 2024, and allow South Africans to understand what Google was doing to ensure that information was legitimate, as this would help them to make up their minds and deal with fake news, democracy and social media.

He said the facilitator for the roundtable discussion would not be from the Committee, to ensure that all Members could engage freely.

PYGMA Consulting

Ms Thabisa Faye, Managing Director, Pygma Consulting, presented on the role of social media in civil society, social media in South Africa, regulations and legislation, and content moderation and misinformation.

She noted how social media and digital media were fast-paced and data-driven, and had the ability to influence votes, misinform the public and spread hate speech. However, they could also increase political influence and lower the cost of disseminating information.

There were 38.19 million internet users (on any device) in South Africa, with Google, YouTube and Facebook being the top websites. Some 25 million people (41.1% of the population) were active social media users.

She concluded by stating that it was important to find a balance between the promotion of freedom of speech and content moderation, as well as to consider the impact content had on the economy, financial trades, politics and social justice.

Misinformation and “fake news” could impact society, children, democracy and the economy. 

Google SA

Dr Alistair Mokoena, Country Director, Google SA, presented on misinformation and content moderation. Over $1 billion was invested annually in content moderation, and there were over 20 000 people with diverse backgrounds, languages and expertise who moderated the content.

In order to support information quality, Google SA relied on the four Rs: remove, raise, reduce and reward. (See presentation for further details)

In terms of elections and politics, Google helped voters to gain access to authoritative information, supported campaigns and elected officials on how to effectively use Google products to reach voters, and monitored and disrupted various forms of abuse across their platforms on a 24/7 basis.

Google SA had partnered with the Independent Electoral Commission (IEC) to train political parties and election monitors on how to use the products for campaigning and to spot and combat misinformation, and also had partnerships with Code for Africa and Real 411.

Roundtable discussion

Google SA

Mr C Mackenzie (DA) thanked Google SA for attending the meeting and said that the only reason for not attending Parliament would be if “you had something to hide,” and clearly Google SA did not feel that way. He said there was a local Google SA office on William Nicol Drive. How many people did Google SA employ locally, and what was the diversity of its employees? Was the local advertising revenue kept track of? Was this declared and if so, did Google SA comply with the relevant SA tax regime in terms of VAT and income tax? Lastly, did Google SA pay tax on the advertising revenue that was generated from South Africa?

He asked if Google’s policies were aligned with the new Protection of Personal Information Act (PoPIA) and, if not, what it would do to bring its policies in line with the Act. He asked Google SA to confirm whether the Google Assistant listened to users constantly.

With regard to search results, he had searched for the local elections 2021, and the IEC website had appeared as the first result. However, when he had changed his Virtual Private Network (VPN) to London, a different result had come up. Did Google make use of users' Internet Protocol (IP) addresses to track their location and direct particular content to them?

Referring to search engine optimisation, he said there was a way to ensure certain pages were on the top of a search page, and said this spoke to digital reputation management. As an example, if an article were released which said that his beard was too white, his family and friends could merely release articles which said his mustache was a “remarkable shade of grey,” and that article would eclipse the white beard story. Was this level of manipulation possible? On what sort of scale did this happen in terms of affecting the legitimacy of results?

He said that when monitoring content, to find a Donald Trump supporter was as difficult as finding someone who had “voted for apartheid”. However, these supporters still existed. The censoring of Donald Trump, not only on Twitter but also on YouTube, was “very disturbing”. He also found it disturbing when leading voices, no matter how hateful, were removed or censored. It was important to allow these people to stand on a platform and speak, in order to engage and challenge their point of view. This would allow for possible conversion or persuasion away from their hateful thoughts. Who had made the decision at Google to “cancel someone who 75 million people voted for?”

Alphabet was the holding company for Google and owned a number of companies, including Waze, Fitbit, Google and YouTube. Thus, if a Google account was linked across multiple devices, it would have access to users’ locations, where they had travelled to, for how long they were gone, which restaurant they had visited, or how fast they were driving. He asked for Google to comment on their “unparalleled” ability to track and trace individuals.

Lastly, he asked Google to comment on their operations in China and Myanmar. Did the government there censor Google’s content? What had the reaction of the Myanmar regime been in terms of censoring Google’s content and channels?

Ms Faku (ANC) referred to the use of algorithms for social media content and the influence on content generation, and asked to what extent Google was using algorithms to deal with misinformation and harmful content.

She said South Africa had a lot of civil organisations, and asked how it was ensured that their contributions would sustain the democratic ideals of South Africa, such as the restoration of its social fibre. Moreover, what should the role of social media networks be in promoting the constitutional provision related to democracy within the country? She asked for more detail regarding the dissemination and consumption of information on the Google platform.

What partnerships did Google have with local or regional security authorities to ensure adequate enforcement of the law on these platforms in terms of cyber security, particularly in light of the case of the young girl in Limpopo?

She asked if there were any projects, or projects being planned, which supported the micro-economy in South Africa in terms of investment, strategy and future outlook.

How many young people had been trained in the country? She acknowledged the partnership with the National Youth Development Agency (NYDA).

Lastly, what role did Google play on the issue of artificial intelligence (AI) and machine learning (ML)?

Google's response

Dr Mokoena said that Google had been operating on the African continent for 15 years and had 150 employees in Africa, 70 of whom were employed in South Africa at its Bryanston office. There was a 50/50 split in terms of gender and race, and it was about to embark on its black economic empowerment (BEE) audit the following month.

He said that Google did pay tax and complied with all tax laws.

Google had worked with the General Data Protection Regulation (GDPR) whilst preparing for PoPIA before it came into force, so there were policies on privacy.

Personal information, privacy and consent were important words. Google made it clear and easy for users to opt in or opt out of sharing their information. Google made it clear why information was gathered, and said its mission was to make the world's information accessible and usable. This was something Google worked hard to achieve. However, there was a fine line between the technology being too helpful and Google having too great an access to personal information.

Google had partnered with the Yes Foundation and the Praekelt Foundation to address the youth employment issue, and had committed the previous year to training half a million youths to be “job ready”. It had partnered with Coursera to drive digital skills training. It supported micro businesses and had done a lot of work within the South African tourism industry to support small businesses.

Lastly, he said Google was a part of the Institute for Intelligence Systems at the University of Johannesburg (UJ) regarding AI and ML.

Mr Charles Murito, Country Manager, Google Africa, noted the seriousness of the engagement with the Committee, and its view that open dialogue was important. He said there was no way that search results could be manipulated, because Google raised information which was authoritative, and leveraged how useful that information was. He encouraged Members to look at a video on YouTube called “how search works,” which provided an insight into how searches worked, how Google leveraged the quality of a search, how many people bounced back once they had landed on a particular site, and how non-authoritative pages were deprioritised. Searches were driven by the audience and their usefulness, and the key element was to follow through in engagement with users.

Referring to the Trump comment, he said Google took its terms and conditions and community guidelines seriously. Content which was against the terms and conditions would be taken down. This was not censorship, but rather making a good environment for users. Google did not want situations, like the one in Limpopo, where it could be held liable or where cyber bullying could result.

He said that Google did use the IP address to track the location of users as a source of information in order to deliver the most useful information to the consumer. However, the information was completely in the control of the consumer. He encouraged users to visit google.com/myaccount in order to see the information which had been collected. The consents which users had given could be paused or cancelled. Users had the authority, control and mandate to clear the information Google had collected.

He said there were several ways to increase local content, including increasing the number of local businesses on the platform through "Google My Business." This would increase the utility of Google platforms. If a user were to Google a plumber in their area, and a plumber appeared 100 km away, this would not be useful content.

Lastly, Google had published its AI principles. Users could look up the Google AI principles to see the key focus areas and whether they were useful and reflected users.

Ms Yolanda Mlonzi, Senior Analyst, Government Affairs and Public Policy, Google, said that online safety was a priority, and that educating young people about social media, accessing information and misinformation was important. The Webrangers programme, run in partnership with Media Monitoring Africa and the Film and Publication Board (FPB), taught children about misinformation and cyber bullying over nine months. The events in Limpopo made it that much more important to take this seriously.

She highlighted that Google was looking to partner with various relevant departments with a view to scaling online safety content in schools.

Lastly, she agreed that there could be more done in terms of training. Once again, she noted that Google was open to partnerships and perspectives on gaps, to see what more could be done.

Dr Mokoena said that there were products which could be used for internet safety, including Google families, family groups and family links. These products provided parents with useful tools to manage children and their use of the internet.

In order to aid democracy in the country, Google had attempted to archive history through Google Arts and Culture by working with the Nelson Mandela Foundation and the Steve Biko Foundation, as well as through its charitable arm, Google.org.

Ms Faye asked the FPB to comment on the issues relating to the protection of children online.

Ms Abongile Mashele, Chief Operating Officer (COO), FPB, said that operators who entered South Africa had to meet certain requirements. These included having adequate mechanisms to track and detect child sexual abuse material. Moreover, there was multinational cooperation between governments and hotlines across the globe which worked with law enforcement and operators to identify and remove child sexual abuse images. In SA, the legislation still referred to child pornography material, but the global term was child sexual abuse material. Google had these mechanisms in place. On the rare occasions that such material appeared, the FPB would engage. However, it had never really had to raise concerns, as this had occurred only in user-generated content, which the FPB had picked up.

The FPB did track whether websites had the necessary age-gating systems to ensure that children would not have access to inappropriate material. For Video on Demand (VOD) platforms, it had agreements which ensured that age restrictions were applied. Online platforms used various technological means to age-gate their platforms. The FPB continuously acknowledged and monitored the licensing agreements with the platforms.

Lastly, she said it was important to educate consumers, and the FPB had invested in resources to educate consumers and children regarding online safety measures.

Ms Cynthia Lesufi, DCDT, said that the Department worked “tirelessly” to address the issues of children online. The International Telecommunication Union (ITU) had recently adopted guidelines on the protection of children online, and as the Department was a member of the ITU, it was working closely to ensure that the guidelines focused on how children could employ strategies when dealing with the risks of online interactions, and how parents could protect children online. This would guide policy makers and educators on strategies and interventions to deal with issues of children online. She highlighted the chief directorate, which focused mainly on children, youth and people with disabilities, and was aimed at engaging with platform owners on how to make use of the guidelines and make sure children were protected online.

Discussion -- round two

Dr Vukosi Marivate, from the University of Pretoria, specialising in ML, AI and language, asked how many of Google's 150 employees on the continent and 70 in SA were directly involved in moderating content, as opposed to policy making. How many employees on the continent were involved in engineering roles and building direct tools for the core of Google? He used Covid-19 as an example of the spread of misinformation, and asked Google to clarify how misinformation was handled across the various social networking sites. He highlighted that information, especially misinformation, was able to spread from YouTube to Twitter or other social networks, and asked how Google managed this, as well as what it actually took to remove this content.

Mr William Bird, Director of Media Monitoring Africa, raised the importance of critical digital literacy skills as a key component for the digital future, not only regarding misinformation, but in general too. He referred to his partnership with Google in working with children aged 13 to 17, which addressed the issue of safety online and acquiring the skills of an effective “digital citizen,” whilst advancing into the fourth industrial revolution (4IR).

He applauded Google for working closely with Real411 on disinformation, and said that as SA headed into elections, this provided a space over which the IEC had oversight and which was regulated in line with local law. He commented that it was an important initiative, not only for Google, but also for Facebook, TikTok and Twitter, as it was a multi-stakeholder initiative.

What plans did Google have around political advertising, and what information would it be able to share? There would be a political advertising repository over which the IEC had oversight, with the aim of showcasing the purpose of political parties. This would allow people to see clearly whether or not an advert was a credible advert from a political party.

Dr Jabu Mtsweni, Head: Cyber and Information Research Centre, Council for Scientific and Industrial Research (CSIR), said that misinformation was a complex issue. The process of taking down misinformation was “blurry,” as was the question of what was considered misinformation and disinformation, since not all misinformation was harmful. He asked for further clarity, above and beyond the four Rs, on the process and decisions made around what was considered misinformation.

Misinformation was complex and could be spread across different channels, be that in newspapers or radio. How did Google collaborate across platforms, civil society, and institutions to curb misinformation?

He highlighted that QAnon conspiracy theorists were making use of Google podcasts as a means of distributing their ideologies, as well as profiting financially at times from this Google platform. How did Google circumvent that? What did it do with the money which had been made from the misinformation provided on these podcasts?

As the elections approached, the CSIR would like to offer their assistance with early warning detection systems to help detect misinformation. Lastly, he pointed out the importance of training and misinformation awareness campaigns.

Mr Mbhele agreed with Ms Mlonzi, and said that censorship was not a “good way to engage” with disinformation or misinformation; it should rather be countered through good and factual arguments. He commented that this did not extend to hate speech, but that engaging in conversations would nonetheless be more sustainable. He agreed with Mr Mackenzie on the issue of de-platforming.

He referred to a quote -- “we used to think that ignorance or stupidity is a result of a lack of access to information, but it turns out that that is not actually the issue”. He said this could highlight the need for critical thinking on the part of information users, and for media literacy skills regarding the information accessed.

He said that Google operated within the “attention economy,” and that it had an incentive to channel information to users which appealed “to their biases.” How did it strike a balance between providing users with authoritative and true information, whilst also optimising the commercial aspect of targeted advertising? He said that this lent itself to people with extremist views being placed in “echo chambers.” Where was the balance between Google’s ethical mission and the commercial imperative which was rooted in the business model?

He gave an example. If he were a “flat earther” who utilised Google or YouTube to source evidence for this point of view, would he be exposed to contrasting views which challenged his point of view? He referred to the “YouTube rabbit hole”, whereby whilst watching YouTube videos, the recommended videos would be along a similar narrative. This could allow for people to “end up in the echo chambers.”

The Chairperson referred to the point about political parties and the opportunity to correct or relay quality content that rebutted fake news, and wanted further clarity on the extent to which Google worked with the regulators, the IEC and ICASA, to ensure that there was no perceived bias in terms of the time given to each political party or independent candidates. How was this managed?

He commented on the importance of language regarding misinformation. Although English was the dominant language within South Africa, how did Google deal with the spread of misinformation in different languages?

Regarding timelines, he noted that within the period of political party campaigning there were also legal processes which might have to be resolved. He was pleased that Google had worked with the IEC, but while a matter was being fought in court, the misinformation continued to the detriment of the party or individual concerned. Once a matter had been reported, how effective was Google in tackling an issue of misinformation?

He said that during the lockdown, fake news on Covid-19 had been dealt with very quickly. However, it was becoming clear that there would be misinformation regarding the vaccinations.

The Chairperson lost connection and became inaudible.

Google's response

Mr Murito responded on the issue of content removals, and said Google endeavoured to support and drive quality information. Removals were driven by an algorithm aimed at removing misinformation. Over 80% of the content which went against Google’s community guidelines was removed before a single consumer had seen that content.

There were over 20 000 people across the organisation who worked on content moderation. Google worked holistically as an organisation to ensure that its products could be used across the world. User needs were imperative, and creating products in a country-specific way would not be conducive to optimising accessibility and usefulness. It was important for the platform to be received across the African continent, and there were offices in South Africa, Nigeria, Ghana (Accra), Kenya (Nairobi) and Egypt. The people within these offices created and connected the “local flavour.” There were engineers in the Accra office, as well as in other parts of the continent. However, he noted the importance of engineers in Africa working not only on “African solutions,” but on all solutions across the continent and globe. In order to promote interconnectedness, especially in relation to the African Continental Free Trade Area (AfCFTA), Google would need to ensure that it was able to leverage the connectedness of the continent and its products so they could be used around the world.

He said that Google placed great importance on moral interests, as without users there would be no commercial business. User trust was needed, and that was the main reason why it was very simple for users to access their information and understand how their information was used.

Lastly, on political advertising, elections were categorised as “sensitive events,” and therefore extra caution was placed on the content published online. He added that not all political content was distributed as an advert.

Ms Mlonzi said that Google worked with vendors and partners, and although its central team might not be based in SA, it did work with local partners on moderation and flagging. She highlighted a programme, Trusted Flagger, which allowed flaggers in SA to flag content on the Google platforms. Once these flags had been raised, Google would review the content and consult the flaggers. She said that there were local teams, and it was a collaborative effort; the 20 000 people did not work in isolation.

She said Google was still “grappling with misinformation,” and that these conversations were very important. It was a moving target, specifically regarding the resurfacing of misinformation which had been removed. However, Google did invest in technology and partnerships to work on these issues. She urged that continuous engagement take place, and that partnerships be fostered.

Dr Mokoena said that there were engineers based in the Bryanston offices. Moreover, Google had programmes aimed at spreading the use of local currencies and languages. He encouraged partnerships which would propagate African languages on the internet.

Ms Faye summarised the responses given by Google, and welcomed the partnerships which were occurring between the departments, entities and Google.

Google discussion -- round three

Ms Nelisiwe Dlamini, a researcher at the CSIR, asked how Google ensured that misinformation which had been removed would not come back in a different form. How was Google proactive about recycled information? Did it have a way to identify patterns in information and ensure it did not come back?

She asked whether, although content which had been put on YouTube was taken down, there was a way to “debunk” the stories which had been removed. This would be in line with digital literacy. She suggested making use of “influencers,” as they had a large footprint and would be able to reach the youth, particularly around election times.

Mr Mackenzie asked if Google Assistant was compliant with PoPIA, and if Google would provide clarity on the censorship issues within China and Myanmar.

Google response

Mr Murito said that Google had been engaging with the PoPIA legislation in relation to its business, and was working on personally identifiable information (PII). It was taking steps to register an information officer, which would be in line with the regulations. This would cover Google Assistant, as well as all Google products available in South Africa.

He said that Google did have people who worked in China, but its platforms were unavailable owing to the great internet wall which restricted its services.

Regarding recycled misinformation, he said that the four Rs were used to remove content which promoted misinformation, and Google would rather raise authoritative information. Google worked with publishers across the globe to raise content, and within SA it worked with Media24, organisations, election boards and Media Monitoring Africa to raise content and disseminate authoritative information.

Lastly, he reiterated that over 80% of content which went against community guidelines had been removed before anyone had been able to see it. Beyond the human capacity, Google had the technological capacity to “catch misinformation.”

Dr Mokoena said that Google Assistant was a product which was aimed at making users' lives easier, but it could be turned off at any point if users were uncomfortable.

Influencers could reach millions of users, but there were risks when working with them. He said that Google had trained journalists, and that “loads of work” was happening.

Lastly, around six million individuals had been trained in digital skills, and Google had developers across the continent. It needed people on the continent who understood African needs.

Ms Faye agreed that it was important to be aware when accepting privacy terms and conditions, as well as of digital consent. She stressed that when users downloaded applications, they gave those applications access to their information, and this was something people should be aware of. It should be seen as a “two-way street,” whereby users had the right to decline access to their information, but also a responsibility to read the terms and conditions. She highlighted the importance of digital literacy.

Google discussion -- round four

Dr Mashilo Boloka, DCDT, questioned the monetisation of content by independent content makers. What was Google’s stance regarding paying content creators for their content which could be used on platforms such as Google? This was not only for the public good in terms of content, but it was a copyright issue. He commented that if creators were being paid in developed countries, the same must be done in Africa.

He asked whether Google was a news carrier, even if through intermediaries, and whether it subscribed to any codes or standards in order to comply with prescribed news standards.

Ms Faku (ANC) asked if there had been any monitoring of the impact of the training which had been conducted by Google. She asked Google to comment on the recent privacy controversy with reference to WhatsApp and users moving to other social networks, such as Signal.

Mr Trevor Rammitlwa, Chief Executive Officer (CEO), National Electronic Media Institute of South Africa (NEMISA), asked how digital training was done and what lessons had been learnt. He indicated that NEMISA was charged with the responsibility of delivering digital training, and offered to partner with Google and the FPB to go into the communities and offer these skills.

Ms Lebo Leshabane, Chairperson of NEMISA, asked if Google had any concessions for institutes like NEMISA. She said the entity was tasked with digital training for communities in the country, particularly for disadvantaged youth and women, and would like to use platforms like Google to promote their courses and to access beneficiaries. This would allow for greater participation in the NEMISA courses. 

Dr Busisiwe Vilakazi, Acting Executive of the State Information Technology Agency (SITA), was inaudible.

Google response   

Dr Mokoena said that Google worked closely with social partners to track the progress of training, as well as with key performance indicators (KPIs) which monitored the initiatives. Moreover, because there was a small team on the continent, Google worked with partners, such as Africa 118, to allow for the scaling of its services. He suggested that “we should chat to see how we can partner on these initiatives.”

Mr Murito said it was important to understand how Google worked. It was a search engine which surfaced content for which people were looking. It did not create content, it did not publish news and it was not a news agency. Google made money through advertising, and the majority of the share of the advertising revenue went back to the content owner. An AdSense partnership would be created, and the majority of the dollars from the advertisement would go to the content creator.

He used the Yellow Pages directory as an analogy to highlight how it allowed content creators to be found by users and allowed for advertising, with a share of the money then going to the creator.

Lastly, he said he was unable to speak on behalf of the other platforms, as they were not at the meeting themselves.  

Google discussion -- round five  

Dr Vilakazi asked for the actual number of engineers who were local and were developing the technologies for the core of Google's services. She said that this was an important question, because diversity mattered, especially in relation to AI and how certain groups were represented. It was not about creating technology for Africa only, but for the globe, whereby Africans contributed to the overall technology.

She asked what Google was doing to support local innovation and innovators to be able to develop the technology which Google offered. This would allow the African continent to be not only a consumer, but also a producer, of technologies and development.

Mr Bird said that one issue was programmatic advertising and media sustainability. A challenge was that companies would buy programmatic advertising through the Google platform, as it was a cheap and effective way for brands to reach different audiences. However, these brands could then run the risk of appearing on “dodgy” websites. He said that there were websites which were created with the intention of spreading misinformation, and which then made money from that misinformation. However, this ran against the democratic responsibility and ethical considerations of advertising and making money. What could Google do to ensure that credible brands did not appear on dodgy websites?

Google's response

Mr Murito responded to Dr Vilakazi’s question, and said that the engineers in Accra and other offices were working not only on African solutions, but on all Google products. The African e-Conomy Report 2020 outlined the country-by-country analytics of engineers, technology experts and developers, and what the potential of the internet economy could be by 2025 -- which could be $180 billion in terms of economic contribution to the continent. Thus, Google had continued to employ engineers around the continent, because diversity and inclusion were taken seriously. He said that Google was consistently adapting its searches and services to be accessible to all, whether they were in Africa or elsewhere. Thus, an exact number of engineers could not be given, as the situation was constantly evolving.

He said that Google was currently hiring a Director of Cloud -- an engineering position -- which could be based anywhere across the continent. Other engineering posts had been filled in Kenya, Ghana and SA. He said that more Africans were being hired into these roles.

One programme that Google had was Launchpad Africa, which worked with start-ups across the continent. So far, Google was working with 27 start-ups, and had helped them to scale up in terms of grants, mentorships and cloud hosting. This was to build businesses, not only in their own home countries, but so that these solutions could be accessed by the continent and beyond. He encouraged Members to read the e-Conomy report, as well as the White Paper and the Digital Sprinters Report, which highlighted the four key pillars that were vital to the growth of the digital economy. The first pillar addressed human capital and the need to drive digital skills in order to ensure that Africans had the right skill set for the fourth industrial revolution, as recommended by the President.

The second pillar addressed physical infrastructure, and noted that people in remote areas should be able to access the internet. If they were unable to get online, they would be unable to access the internet economy. This would allow for the cost of data to be reduced. The third pillar was on the adoption of technology, especially in the government, to drive efficiency. Lastly, the fourth pillar addressed competition and wanted to ensure that there were the correct regulatory frameworks in place.

He highlighted that the African Continental Free Trade Area (AfCFTA), which 37 countries had ratified, together with the digital protocol in process, which should be completed by early the following year, would amount to a $3.7 trillion marketplace that could be leveraged across different areas.

Lastly, he said that it was important to create centres of excellence and export those things, such as Fintech. A pan-African perspective was needed in order to grow the economy of the continent.

Dr Mtsweni commented on the issue of regulations. He said that combating misinformation was complex and required multiple stakeholders. Legislation was key, and he asked if Google believed that self-regulation was the best approach, or if the Government should put baseline regulations in place, such as PoPIA or the Cyber Crimes Bill. He said that people who promoted misinformation could be doing so innocently, but there could also be criminal activities. During the times of elections this could manifest in criminal activity for profit, for political influence or propaganda. Moreover, the misleading content could be dangerous for national security measures too.

Dr Marivate said that they should not see each other as adversaries, and commented that they were all from different backgrounds, sectors or entities. The question of local engineers and local monitors was being asked because, especially during elections, issues would occur which were specific to SA. Thus, it was necessary to have local engineers and monitors who understood the SA context and engaged with civil society and journalists. He mentioned the issue of astroturfing on platforms such as YouTube, TikTok and Twitter.

He highlighted the case of the storming of the Capitol Building in the United States earlier in the year, and how the local engineers and monitors had a direct line on what had happened. He commented that a localised team was needed to talk through the processes. A common understanding was necessary.

Google's response

Mr Murito said that with programmatic advertising, any key product had to abide by the AdSense policy to be on any platform. If any website were to go against the community guidelines, a ban would be issued and no AdSense content would be served.

On local participation, he emphasised that there was a lot of local representation on the continent, and in SA specifically. They had a very strong partnership with Real 411, whereby different misinformation was flagged. Moreover, there was a direct line whereby Google could be contacted directly, and the classification of elections as “sensitive events” did increase the caution around content.

He gave an assurance that Google was doing everything it could. He agreed that they were not adversaries, and said that they should all work together on these issues whilst having the responsibility to have the right information, flag content, raise issues and deliver the best work. The digital economy would offer an opportunity for the continent, especially regarding youth unemployment.

Lastly, as had been mentioned in the White Paper, Google was in favour of smart regulations. Google had worked with various Government agencies in SA to craft the right regulations. Being in favour of smart regulations would allow Google and its services to work within the country’s guidelines. It had collaborated with the FPB, and believed that it was important to work with regulators to ensure best practices.

He ended by stating that it was impossible to manually classify every piece of information uploaded on to YouTube. Over 500 hours of content was uploaded to YouTube every minute. Therefore, it would be humanly impossible to categorise that content. They would rely on algorithms and human intervention to deliver the content which was in line with the community guidelines, as well as user flags.

Ms Faye thanked the entities, Google and Committee for their contributions. She acknowledged the robust conversations, partnerships in the future, and the gaps which remained. Misinformation and content monitoring had been a point of contention for FPB and ICASA as regulators, so this had been an insightful meeting.

The Chairperson thanked the Committee for their participation, as well as Google for their presentations, responses and willingness to form partnerships.

He thanked Ms Faye for her role as facilitator of the roundtable conversations, the support staff, Department and entities.

Mr Murito said that there would be an Africa Day concert on YouTube at 18:00, with the Minister contributing to it.

The meeting was adjourned.
