No. I'm saying that the large Transformer models are a symptom of the centralization of monopoly rents in the hands of people who have no business allocating capital, let alone allocating it in ways that permit them to cripple advances in the social sciences that might expose those monopoly rents as corrupting the allocation of capital -- raising red herrings about "race" and "gender" and "xenophobia", ad nauseam, instead.
Where is Google's big investment in sparse matrix multiplication hardware, for example? Why the reliance on dense models when it is known that's not how the brain works? Doesn't Google have enough cash lying around to do something along these lines? And why the emphasis on "big" models when it is known that minimizing the size of the algorithm that outputs the data is the optimal model selection criterion? Might it have anything to do with trying to HIDE that ground truth of AGI by throwing vast amounts of money around that they shouldn't have in the first place?

And, finally, there is your misconception that Transformers are, in some sense, recurrent -- a misconception advanced by Google's own paper titled "Attention Is All You Need", leaving you here to do their dirty work for them in your Google-induced brain fog.

On Thu, Dec 23, 2021 at 8:19 PM <[email protected]> wrote:
> Oh, are you saying that these large Transformer models are so truthy that
> the people aren't allowed to see the truth in them?

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T358f938c1cfb5c51-Mb857a178c5874cf0e3cc049d
Delivery options: https://agi.topicbox.com/groups/agi/subscription
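P.S. For anyone unfamiliar with the model-selection criterion invoked above: it is minimum description length, the computable stand-in for Kolmogorov complexity / Solomonoff induction -- prefer the model that minimizes the bits needed to describe the model itself plus the bits needed to describe the data the model fails to predict. A minimal sketch, using zlib as a crude proxy for a universal compressor (the toy data, variable names, and helper function here are illustrative assumptions, not anyone's actual implementation):

```python
import random
import zlib

def description_length(model_code: bytes, residual: bytes) -> int:
    """Two-part MDL score: compressed size of the model itself plus
    compressed size of the data left unexplained by the model.
    zlib stands in (crudely) for a universal prefix machine."""
    return len(zlib.compress(model_code)) + len(zlib.compress(residual))

# Toy data: 1000 bytes from a seeded PRNG -- statistically incompressible
# to zlib, yet generated by a very short program.
rng = random.Random(0)
data = bytes(rng.randrange(256) for _ in range(1000))

# "Model A": memorize the data verbatim (empty model, full residual).
score_memorize = description_length(b"", data)

# "Model B": a short program that regenerates the data exactly,
# so the residual is empty.
program = b"random.Random(0); bytes(rng.randrange(256) for _ in range(1000))"
score_program = description_length(program, b"")

# MDL prefers the short generating program over rote memorization.
assert score_program < score_memorize
```

The point of the sketch is only that "big" is penalized twice: a larger model costs bits directly, and buys nothing if a shorter program outputs the same data.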
