Chairwoman Johnson's Floor Statement on H.R. 4355
Chairwoman Eddie Bernice Johnson (D-TX)
on H.R. 4355, the Identifying Outputs of Generative Adversarial Networks Act
I rise today in support of H.R. 4355, the Identifying Outputs of Generative Adversarial Networks Act.
“Deep fake” technology, which manipulates photos, videos, or audio clips to produce content that seems real but is not, has become increasingly commonplace in recent years. This rise has been spurred in part by advances in computing power, the widespread availability of images and other data, and the use of artificial intelligence. In many cases, the applications of this technology may be benign. But bad actors can also use it to spread disinformation and cause great harm to individuals, to organizations, and to society as a whole. During the Science Committee hearing on online imposters and disinformation earlier this year, one of the witnesses demonstrated a deep fake video in which he swapped the likenesses of two Members of Congress at the hearing.
Despite the spread and potential harm of deep fake technology, there are currently no sure-fire methods of identifying manipulated content and distinguishing it from authentic content. The ability to make that distinction is essential to maintaining our national and economic security and protecting against malicious uses of these technologies.
H.R. 4355 leverages the strengths of the National Science Foundation and the National Institute of Standards and Technology by directing these agencies to support research on manipulated or synthesized content in order to help develop the standards and other tools necessary to detect this content.
I’d like to commend my colleagues Representatives Gonzalez, Stevens, and Baird for their excellent leadership on this bipartisan legislation, and I urge all of my colleagues to join us in passing this bill.
I reserve the balance of my time.