Containerize Notebooks
In this tutorial we will show how to run a containerized Snowflake Notebook using the GPT-2 model from Hugging Face.
Video
Video is still in development.
Requirement
Downloads
- Notebook (Link)
Setup
Run the following if you don't yet have a database, schema, or warehouse:
use role sysadmin;
-- Create a database to store our schemas.
create database if not exists raw;
-- Create the schema. The schema stores all our objects.
create schema if not exists raw.notebook;
/*
Warehouses are synonymous with the idea of compute
resources in other systems. We will use this
warehouse to call our user defined function.
*/
create warehouse if not exists development
warehouse_size = xsmall
initially_suspended = true;
use database raw;
use schema notebook;
use warehouse development;
GPU Compute Pool
First, let's create a GPU compute pool for our notebook using the accountadmin role, then grant sysadmin usage on it.
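A minimal sketch of those worksheet commands, assuming a pool named gpu_pool (the pool name and node counts are placeholders; pick values that fit your account):

```sql
-- Compute pools are created with the accountadmin role.
use role accountadmin;

-- GPU_NV_S is Snowflake's smallest GPU instance family.
create compute pool if not exists gpu_pool
  min_nodes = 1
  max_nodes = 1
  instance_family = gpu_nv_s;

-- Let sysadmin run notebooks on the pool.
grant usage, monitor on compute pool gpu_pool to role sysadmin;
```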
External Access
Let's create the network rules in a worksheet to allow our Snowflake Notebook to talk with our external sources.
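A sketch of the network rules and the external access integration that bundles them; the rule and integration names below are placeholders, and the host lists cover the PyPI and Hugging Face endpoints the notebook downloads from:

```sql
use role accountadmin;

-- Egress rule for installing packages from PyPI.
create or replace network rule pypi_network_rule
  mode = egress
  type = host_port
  value_list = ('pypi.org', 'pypi.python.org',
                'pythonhosted.org', 'files.pythonhosted.org');

-- Egress rule for downloading models from Hugging Face.
create or replace network rule hf_network_rule
  mode = egress
  type = host_port
  value_list = ('huggingface.co', 'cdn-lfs.huggingface.co');

-- Bundle the rules into an integration the notebook can enable.
create or replace external access integration pypi_hf_access_integration
  allowed_network_rules = (pypi_network_rule, hf_network_rule)
  enabled = true;

grant usage on integration pypi_hf_access_integration to role sysadmin;
```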
Notebook
Notebooks cannot be run as accountadmin. Let's start by switching to the sysadmin role.
Setup
Upload the example notebook provided.
Select Run on container, the ML runtime, and the GPU compute pool we created earlier.
To enable external connections to PyPI and Hugging Face, open the notebook settings and enable both external access integrations.
Click Run all; the notebook displays a timer. A full run typically takes 5-7 minutes.
Result
Once the notebook runs, you will see that we first make a directory for the files to be downloaded to.
We then download the GPT-2 model and save it to our files folder.
Finally, we use the model and the GPU to send a prompt and get a response back.