I taught philosophy to GPT-2 and it felt like I was talking to a deranged but intelligent person
I merely wanted to have a decent conversation, but GPT had other plans
5 min read · Feb 1, 2021
By now, most of you have probably seen the tweets, apps, code, and texts produced by OpenAI's mighty GPT-3 model. The 175-billion-parameter model was trained on a large corpus of text drawn from across the internet. It has already amazed people in both tech and non-tech spaces with its real-world results, and folks are drooling over the possibilities in the coming…