Image: Moore's law

The DH Seminar: Andrew Lison: Convolute (N)eural: Artificial Intelligence at the End of Moore's Law

Research

Andrew Lison, Assistant Professor of Media Study at the University at Buffalo, SUNY, will give a seminar titled "Convolute (N)eural: Artificial Intelligence at the End of Moore's Law."

Event type: Seminar, Webinar
Date: 5 Feb 2021
Time: 14:00–16:00
Location: Zoom

Why have we seen a tremendous amount of interest in neural-network-based artificial intelligence in recent years? Proponents of the technology point to three major reasons: larger sets of digitized information becoming available for training, breakthroughs in software techniques, and increases in processing power.

This talk offers another possibility, in some sense the opposite of this third point: while increases in speed may have made certain techniques newly feasible, material limits on the development of computing power have necessitated a move to systems, like neural networks, that can make the most efficient use of present-day hardware. Intel Corporation co-founder Gordon E. Moore's "law," which posits exponential growth in semiconductor density (and thus computing power), has shown signs of faltering for much of the 21st century, and the parallel-processing approaches promoted in light of this reality have proven ideally suited to convolutional neural networks. The contemporary push for AI-enabled software and even hardware, then, can be understood as a response to an imminent plateau in discrete, silicon-based computing power, itself a proxy, in a digitally connected age, for socio-economic growth.
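As a minimal illustration of why convolution maps so naturally onto parallel hardware (a generic sketch, not material from the talk): each output element of a convolution depends only on a small local patch of the input and a shared kernel, so every output can in principle be computed simultaneously rather than in sequence.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 2-D 'valid' convolution (technically cross-correlation,
    as in most neural-network libraries). Every output element reads
    only a local input patch and the shared kernel; no iteration
    depends on another's result, which is exactly the data
    parallelism that GPU-style hardware exploits."""
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.empty((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):      # each (i, j) below is independent,
        for j in range(out.shape[1]):  # so the loop order is arbitrary
            out[i, j] = np.sum(image[i:i + kH, j:j + kW] * kernel)
    return out

image = np.arange(25.0).reshape(5, 5)
kernel = np.ones((3, 3)) / 9.0         # simple 3x3 box blur
print(conv2d_valid(image, kernel))
```

The nested loops here run sequentially only because plain Python offers no parallelism; the independence of each output element is what lets parallel processors evaluate them all at once.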

Image: Andrew Lison