
How to run ollama in google colab?


I have code like this. When I run it, I get an ngrok link.

!pip install aiohttp pyngrok

import os
import asyncio
from aiohttp import ClientSession

# Set LD_LIBRARY_PATH so the system NVIDIA library becomes preferred
# over the built-in library. This is particularly important for
# Google Colab which installs older drivers
os.environ.update({'LD_LIBRARY_PATH': '/usr/lib64-nvidia'})

async def run(cmd):
  '''
  run is a helper function to run subcommands asynchronously.
  '''
  print('>>> starting', *cmd)
  p = await asyncio.subprocess.create_subprocess_exec(
      *cmd,
      stdout=asyncio.subprocess.PIPE,
      stderr=asyncio.subprocess.PIPE,
  )

  async def pipe(lines):
    async for line in lines:
      print(line.strip().decode('utf-8'))

  await asyncio.gather(
      pipe(p.stdout),
      pipe(p.stderr),
  )

await asyncio.gather(
    run(['ollama', 'serve']),
    run(['ngrok', 'http', '--log', 'stderr', '11434']),
)
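For context, the point of the tunnel is to reach the Ollama HTTP API from outside Colab. A minimal sketch of such a call, assuming a placeholder forwarding URL and that some model (llama2 here, purely as a stand-in) has already been pulled:

import requests

# Placeholder: use the actual https://....ngrok-free.app address printed by ngrok above.
OLLAMA_URL = 'https://example.ngrok-free.app'

# /api/generate is Ollama's completion endpoint.
resp = requests.post(
    f'{OLLAMA_URL}/api/generate',
    json={
        'model': 'llama2',       # stand-in; use whichever model was actually pulled
        'prompt': 'Why is the sky blue?',
        'stream': False,         # return a single JSON object instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()['response'])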

When I open that link, this is what the page shows:

[screenshot of the page opened at the ngrok link]

How can I fix this? Before running the code above, I had done the following:

!choco install ngrok
!ngrok config add-authtoken -----
!curl https://ollama.ai/install.sh | sh
!command -v systemctl >/dev/null && sudo systemctl stop ollama
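As a sanity check (a sketch only, relying on Ollama's default bind address of 127.0.0.1:11434), the server can be probed locally before involving ngrok at all; the root path normally answers with "Ollama is running":

import requests

try:
    # Ollama listens on http://127.0.0.1:11434 by default.
    r = requests.get('http://127.0.0.1:11434', timeout=5)
    print(r.status_code, r.text)   # expected: 200 and "Ollama is running"
except requests.ConnectionError:
    print('Ollama server is not reachable on port 11434')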
