OPT-175B

Meta is now offering the Open Pretrained Transformer (OPT-175B), a language model with 175 billion parameters trained on publicly available data sets. The release gives researchers around the world unprecedented access to a model of this scale. For the first time, researchers and developers can obtain both the pretrained models and the code needed to train them, released under a noncommercial license that restricts use to research purposes. Access to OPT-175B will be granted to academic researchers; those affiliated with government, civil society, and academic organizations; and industry research laboratories.
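As a minimal sketch of what this access looks like in practice, the snippet below loads one of the smaller OPT checkpoints that Meta published alongside the 175B release and generates a short continuation. It assumes the Hugging Face transformers library and the publicly hosted facebook/opt-125m checkpoint; the full OPT-175B weights themselves are provided only on request under the research license, so they are not used here.

# Sketch: loading a smaller OPT checkpoint with the transformers library.
# Assumes `pip install transformers torch`; facebook/opt-125m stands in
# for the gated OPT-175B weights, which require a research-access request.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-125m"  # smaller sibling of OPT-175B
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Large language models are"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding of a short continuation from the prompt.
output_ids = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))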
