[Governance] How strongly captured by Big Tech is the field of AI research

parminder parminder at itforchange.net
Sun Nov 21 02:19:51 EST 2021


I think I may be excused for thinking that Suresh Ramasubramanian is a bot
that dutifully acts within 60 minutes of my every posting to this list,
to trash it, choosing some random part of my message and pairing it
with some criticism selected from publicly available data :) ...


List managers, is there any way such bots can be turned off :) ... thanks,
parminder



On 21/11/21 12:41 pm, Suresh Ramasubramanian wrote:
> Poor developing country? I’m rather scared of big data in this current
> Indian government’s hands: even more access than they currently have,
> which with Aadhaar is already a lot.
>
> Given the uses to which voter data has already been put nearly two
> decades back:
> https://m.timesofindia.com/india/misuse-of-voters-list-in-gujarat-riots-alleged/articleshow/3541858.cms
>
>
>
> --srs
> ------------------------------------------------------------------------
> *From:* Governance <governance-bounces at lists.igcaucus.org> on behalf
> of parminder via Governance <governance at lists.igcaucus.org>
> *Sent:* Sunday, November 21, 2021 11:56:26 AM
> *To:* Mueller, Milton L <milton at gatech.edu>;
> governance at lists.igcaucus.org <governance at lists.igcaucus.org>
> *Subject:* Re: [Governance] How strongly captured by Big Tech is the
> field of AI research
>  
>
> Milton, happy to respond to substantive points, but disappointed that
> you chose to attack this outstanding article on extraneous, even ad
> hominem, grounds. I'll come to those in a different email, so as not to
> distract from the main points.
>
>
> You ask at the end: "What exactly is the problem being identified here
> and what is proposed as the solution?"
>
>
> I thought Meredith had made both absolutely clear: the problem
> especially, which is extremely profound and disturbing, and also her
> 'solution' (focused on tech employees and academics), which, by the
> way, I do not think is adequate.
>
>
> The problem is that in a world where AI is set to become a prime force
> determining much in all walks of life, it is nearly impossible to do
> any AI research without Big Tech weighing down heavily on it, whether
> the question is how our education processes and content should be
> determined, or the basis of public health strategies, or almost
> anything else. I find it incredible that you see no problem in this.
> Meredith herself points to the time when the military had an
> extraordinary say in a lot of scientific and technical research, and
> she calls it a 'dark history' from which we should take lessons. Even
> if technologies like the Internet were born in defense labs, when
> these technologies and applications entered our lives they were
> mediated by business and others, which meant a useful division of
> power in determining the directions of their development and use. But
> today, from framing the questions, to research, to application, to
> feedback, it is a single Big Tech-captured system. That, dear Milton,
> is the problem. We simply do not want the commercial logic of a few
> Big Tech owners to determine the future of the world and humanity,
> whether this disturbs you at all or not.
>
>
> The biggest problem is that access to all the needed data, along with
> the means of collecting it, rests entirely with Big Tech. It is even
> bigger than the problem of high computational power, which may still
> be mustered with the help of public resources. But all data, and its
> mining shafts, are captive to Big Tech. That is one of the biggest
> contemporary problems, and many have recognized it as such.
>
>
> To which your flippant response is: "Would we feel better if all the
> data was being collected and research done by nation-states?"
>
>
> Well, no, that is not what we want.
>
>
> Your next, rhetorically meant, question is in fact more to the point:
> "Or do we fantasize about the large amounts of capital and data
> somehow being magically in the hands of "the people"? In what
> institutional capacity?"
>
>
> No, we do not fantasize. There is real work happening on this.
>
>
> I see two ways of doing it - both being actively pursued by many
> worthwhile actors.
>
>
> (1) Making many kinds of data sharing mandatory for Big Tech, so that
> the data they collect from society must be contributed to a social
> commons that all or many can use. I know you would consider this
> fantastical. But India's committee on a data governance framework has
> laid out an elaborate legal basis as well as practical implementation
> means for achieving it. (Disclosure: I am a committee member.) This is
> the second draft report
> <https://ourgovdotin.files.wordpress.com/2020/12/revised-report-kris-gopalakrishnan-committee-report-on-non-personal-data-governance-framework.pdf>,
> but the final one, quite a bit better, will be out soon. I know how
> you'd treat a policy document from a poor, developing country like
> India, so I may inform you that within a few months the EU will be
> coming out with its Data Act, which will contain some mandatory data
> sharing provisions. Apart from that, the EU is working on many projects
> to ensure sector-wide data availability and sharing, like its GAIA-X
> project.
>
>
> One of the main objectives of wide data sharing through the above
> means is to decentralise the concentration of digital business with a
> few Big Tech companies. It also works the other way around, which
> brings me to my second point.
>
>
> (2) Breaking up Big Tech by various legal means: separating platforms
> from platform-dependent actors (as proposed by Lina Khan, and in
> India's e-commerce law) or in other ways (see for instance our paper
> on 'Separating data, cloud and intelligence layers
> <https://datagovernance.org/report/breaking-up-big-tech-separation-of-its-data-cloud-and-intelligence-layers>').
> Even the US is mulling new laws to curb Big Tech's power and ensure
> more competition, as are many other jurisdictions. Enforcing platform
> interoperability is a good way to break platform power concentration
> (we are working on a proposal to do so for social media platforms to
> start with, while India's commerce ministry has a group developing a
> platform for e-commerce interoperability). Once you have a
> multiplicity of actors and platforms in any sector, instead of a
> monopoly or two, it mitigates the problem of data availability, as
> well as of concentrated control over AI research.
>
>
> parminder
>
>
>
> On 21/11/21 12:08 am, Mueller, Milton L wrote:
>> I look at this article and see no measurements or even qualitative
>> estimates of the sources of AI funding: the share controlled by "big
>> tech" as opposed to universities, government civilian research
>> institutes, or governmental military projects.
>>
>> All I see are assertions that everything is under the control of "big
>> tech" - although there are useful notes about how governments - not
>> big tech - are now pouring money into "AI" research based on a
>> narrative about geopolitical military competition. And our project
>> (IGP) has already sounded the alarm about Schmidt's and the U.S.
>> military's attempt to make AI research into a "race" with the
>> Chinese. More specifically, they are aiming at "AI supremacy," a
>> rather scary term in our opinion, but in that case the source of the
>> problem is nation-state competition, not a demonized big tech.
>>
>> The author, Meredith Whittaker, is, ironically, a former Google
>> employee who sang a quite different tune when she participated in
>> multistakeholder IG organizations in that capacity.
>>
>> This article may be worth paying attention to, but it's not research,
>> it wasn't written by a scholar, and it's basically an opinion piece
>> upholding the advocacy views of the organization Meredith now works
>> for, whose position is that big tech is bad and should be controlled
>> more by people like, uh, Meredith and Tristan Harris and Frances Haugen.
>>
>> I'd also remind you not to be manipulated by framing. AI is basically
>> software interacting with large data sets. The idea that commercial
>> platforms that generate and collect lots of data and provide software
>> applications and tools to billions of users are at the forefront of
>> AI research should surprise or shock no one. It's like saying that EV
>> companies are at the forefront of battery research. Would we feel
>> better if all the data was being collected and research done by
>> nation-states? Or do we fantasize about the large amounts of capital
>> and data somehow being magically in the hands of "the people"? In
>> what institutional capacity?
>>
>> Is the message that we are supposed to be against AI research per se?
>> Or to eliminate big tech companies altogether? Or to be against
>> "capitalism" because, well, the Chinese Communist Party does so many
>> nicer things with big data and does such a better job controlling its
>> tech companies? Is the goal to "regulate" big tech? If so, how,
>> exactly, and how does that prevent tech/data firms from being at the
>> forefront of AI research anyway? What exactly is the problem being
>> identified here and what is proposed as the solution?
>>
>> Thoughts to consider as we prepare to meet in IGF 😉
>>
>> Dr Milton L Mueller, Professor
>>
>> School of Public Policy
>>
>> Georgia Institute of Technology
>>
>> Internet Governance Project <https://internetgovernance.org>
>>
>> ------------------------------------------------------------------------
>> *From:* Governance <governance-bounces at lists.igcaucus.org> on behalf
>> of Suresh Ramasubramanian via Governance <governance at lists.igcaucus.org>
>> *Sent:* Friday, November 19, 2021 5:21 AM
>> *To:* parminder <parminder at itforchange.net>;
>> governance at lists.igcaucus.org <governance at lists.igcaucus.org>
>> *Subject:* Re: [Governance] How strongly captured by Big Tech is the
>> field of AI research
>>  
>>
>> In other words, civil society needs to work on its own AI, and on AI
>> ethics, as a multistakeholder effort. Are you aware of any such
>> efforts? Or do you plan to launch such an effort?
>>
>>  
>>
>> *From: *Governance <governance-bounces at lists.igcaucus.org> on behalf
>> of parminder via Governance <governance at lists.igcaucus.org>
>> *Date: *Friday, 19 November 2021 at 3:20 PM
>> *To: *governance at lists.igcaucus.org <governance at lists.igcaucus.org>
>> *Subject: *[Governance] How strongly captured by Big Tech is the
>> field of AI research
>>
>> An excellent and eye-opening article in ACM's Interactions magazine on
>> how strongly captured by Big Tech most AI research is today. There also
>> do not seem to be many alternatives on the horizon.
>>
>> https://interactions.acm.org/archive/view/november-december-2021/the-steep-cost-of-capture
>>
>> If AI is the future, and everything AI is shaped by Big Tech, then
>> public interest actors of the world have something that must be
>> addressed urgently.
>>
>> But many apparently are busy handing even tech governance spaces over
>> to Big Tech, so nothing ever comes in the latter's way.
>>
>> parminder
>>