projects: https://github.com/madjin
notes: https://hackmd.io/@xr/book
then take the static HTML result and chat with it using o3 or something
then Claude Code or whatever, easy and solid frontend workflow
I made this in a day: jedaicouncil.com
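Roughly what that "chat with the static HTML" step can look like in code. This is a minimal sketch, assuming the OpenAI Python SDK with OPENAI_API_KEY set; the dist/index.html path and the prompt are placeholders, not anything from the actual project:

```python
# Minimal sketch: feed the static HTML result to o3 and ask for frontend feedback.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set.
# The file path and prompt are placeholders.
from pathlib import Path

from openai import OpenAI

client = OpenAI()

html = Path("dist/index.html").read_text()  # the static HTML result

response = client.chat.completions.create(
    model="o3",
    messages=[
        {"role": "system", "content": "You are a frontend reviewer."},
        {"role": "user", "content": f"Suggest layout and styling improvements:\n\n{html}"},
    ],
)

print(response.choices[0].message.content)
```

From there you paste the suggestions (or the whole file) into Claude Code or whatever and iterate.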
Also a bit jealous seeing people run self-hosted LLMs on MacBooks, but unless you travel a lot you can just have a dedicated machine do that stuff on your local network
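With the dedicated-machine setup, the client side is just an HTTP call across the LAN. A minimal sketch, assuming that box runs an Ollama server (Ollama, the IP address, and the model name are all assumptions for illustration, not part of the setup described above):

```python
# Minimal sketch: query a dedicated LLM box on the local network.
# Assumes the machine runs an Ollama server; the host IP and model
# name are placeholders.
import requests

LLM_HOST = "http://192.168.1.50:11434"  # the dedicated machine on the LAN

resp = requests.post(
    f"{LLM_HOST}/api/generate",
    json={
        "model": "llama3",
        "prompt": "Summarize why a dedicated local box beats laptop inference at home.",
        "stream": False,  # return a single JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```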