On-device Large Language Models with Keras and Android

Learn how to load a large language model built with Keras, optimize it, and deploy on your Android device!
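The talk walks through loading a Keras model, optimizing it, and deploying it to Android. As a minimal sketch of that Keras-to-TFLite conversion path (using a tiny stand-in Keras model rather than a real LLM, and the Python TFLite interpreter rather than the Android runtime):

```python
import numpy as np
import tensorflow as tf

# Stand-in for an LLM: a tiny Keras model, just to illustrate the
# load -> convert -> run pipeline the video covers.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2),
])

# Convert to TensorFlow Lite with default optimizations
# (dynamic-range quantization), shrinking the model for mobile.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Run the converted model with the TFLite interpreter; on Android this
# step would use the TFLite Java/Kotlin Interpreter API instead.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros((1, 4), dtype=np.float32))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]).shape)  # (1, 2)
```

On device, the saved `.tflite` bytes would typically be bundled as an asset and loaded through the Android TFLite runtime.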

Speaker: Wei Wei

#GoogleIO

Products mentioned: TensorFlow - General
Event: Google I/O 23
Comments

Have a burning question? Leave it in the comments below for a chance to get it answered by the TensorFlow team. 👇👇🏻👇🏿👇🏽 👇🏾👇🏼

TensorFlow

So GPT-2 can run on Android devices, with some response delay, though of course GPT-2 isn't as good as GPT-3 or 4.
1) How many years do you think we need before we can have GPT-3 on our Android phones?
2) What tasks could an agent like this improve on the phone?
3) Could we get better next-word prediction in our on-screen keyboards?
4) Could we collect all the chats where we are the sender, use them to feed the LLM, and thereby set up automatic responses for when we are away from our phones?
5) Could we get an advanced-reasoning virtual assistant better than Google Assistant and Siri?
6) Are there any security concerns about having an LLM like this on our phones? If so, what are the recommended practices for handling an LLM on a phone securely?
7) And finally, what other AI types will be available on our phones? I mean speech recognition, image generation, etc.

octaviusp

Debugging TensorFlow's Colabs is just so much fun and totally not a waste of my life...

SashaBaych

I realize the tutorial is 9 months old, but you could at least update the codebase...

SashaBaych

Can TensorFlow Lite be trained directly on a microcontroller? Meaning, instead of training a TensorFlow model on a PC, converting it to TensorFlow Lite, and uploading it to the microcontroller to run there, I want to train TensorFlow Lite directly on the microcontroller. Is that possible? Thank you.

jomfawad

Can you report the speed, latency, or memory usage of this application running on Android?

notwomy

I am trying to implement this using tflite_flutter, but I keep getting the error "Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding dependency flutter". I went ahead and added the dependency in my build.gradle file, but the issue still persists. Does the TensorFlow team by chance have any implementation of LLM models in Flutter? If yes, I'd love a link to the article/video, because I've been stuck on this for weeks.
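The error above refers to the Flex (Select TF ops) delegate, which is needed when a converted model contains TensorFlow ops outside the built-in TFLite op set. In a Flutter project, the dependency goes in the app-level `android/app/build.gradle`; a sketch (the version number here is illustrative, not confirmed by the video):

```groovy
dependencies {
    // TFLite Select TF ops (Flex delegate) runtime for Android
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.14.0'
}
```

If the dependency is present but the error persists, it is worth checking that the version matches the TFLite runtime the plugin links against.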

flutteraddict

I am wondering whether I can achieve on-device training, i.e., use local mobile data to fine-tune the LLM.

knl-ibxo

Will this work on the web with TensorFlow.js instead of Android?

Canadianishere

I have a question about the YAMNet TensorFlow Lite model (Android app). I want to use it with an audio clip as input, not a live recording.

Can you help with that? Thank you for your help.

alanood