Deepgram provides a variety of tools that you can use to configure and deploy our ASR services on-premise. This Quick Start shows only one deployment strategy. If you plan to deploy Deepgram products on-premise using this strategy, follow this Quick Start; otherwise, follow the steps in Configure Hardware & Software, Get and Configure Deepgram Products, and Product Deployment.
This Quick Start supports the following:

- AWS EC2 `p2.xlarge` instances with a Tesla K80 GPU, running Ubuntu 18.04
To deploy using this strategy, read on.
Create a `cloud-init.txt` file with the following content:
```bash
#!/bin/bash

# Fill in these values before launching the instance
LICENSE_KEY=
DOCKER_USERNAME=
DOCKER_PASSWORD=
GENERAL_MODEL=

# Log in to Docker Hub and pull the Deepgram on-prem images
echo $DOCKER_PASSWORD | docker login --username $DOCKER_USERNAME --password-stdin
docker pull deepgram/onprem-api:latest
docker pull deepgram/onprem-engine:latest
docker logout

# Insert the license key into the API and Engine configuration files
cd /home/ubuntu/config
sed -i 's/REPLACEWITHKEY/'"$LICENSE_KEY"'/g' api.toml
sed -i 's/REPLACEWITHKEY/'"$LICENSE_KEY"'/g' engine.toml

# Download the general model and place it where the Engine expects it
cd /home/ubuntu/
mkdir engine
cd /home/ubuntu/engine
mkdir models
cd models
wget $GENERAL_MODEL
mv general*.dg general.dg
chmod 644 general.dg
chown -R ubuntu:ubuntu /home/ubuntu/engine
cd /home/ubuntu/
```
Edit `cloud-init.txt` to enter the appropriate values at the top of the file:
```bash
LICENSE_KEY=
DOCKER_USERNAME=
DOCKER_PASSWORD=
GENERAL_MODEL=
```
For `GENERAL_MODEL`, include the URL of the model provided to you by Deepgram. If you require more than one model or want extra features, like punctuation, contact your sales representative, and we will provide a `cloud-init.txt` customized to your needs.
Provide the contents of `cloud-init.txt` as user data when launching the instance.
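If you launch the instance from the AWS CLI rather than the console, you can pass the file directly as user data. The sketch below is an illustration only; the AMI ID, key pair name, and security group ID are placeholders to replace with values from your own account:

```bash
# Launch a p2.xlarge instance with cloud-init.txt as user data.
# ami-xxxxxxxx, my-key-pair, and sg-xxxxxxxx are placeholders; substitute
# an Ubuntu 18.04 AMI, your key pair, and a security group you control.
aws ec2 run-instances \
  --image-id ami-xxxxxxxx \
  --instance-type p2.xlarge \
  --key-name my-key-pair \
  --security-group-ids sg-xxxxxxxx \
  --user-data file://cloud-init.txt
```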
Start the AWS instance, `ssh` to it, and start the Deepgram `api` and `engine` services by running:
```bash
$ cd /home/ubuntu/config
$ docker-compose up -d
```
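To verify that the services started correctly, you can inspect them with standard Docker Compose commands from the same directory. The service name `engine` in the log command below is an assumption; use whichever names appear in the `docker-compose.yml` in `/home/ubuntu/config`:

```bash
# List the containers and their status
$ docker-compose ps

# Tail recent logs for a service that fails to start; "engine" is an
# assumed service name -- use the name shown by docker-compose ps
$ docker-compose logs --tail=50 engine
```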