Description

The AIShell Ollama agent fails on every prompt with "404 (Not Found)"; the Ollama server log shows the 404 being returned for POST "/api/generate".
Prerequisites
- Write a descriptive title.
- Make sure you are able to reproduce it on the latest version.
- Search the existing issues.
Steps to reproduce
PS C:\Users\54718> ollama list
NAME              ID            SIZE    MODIFIED
codellama:latest  8fdf8f752f6e  3.8 GB  15 hours ago
@ollama> How do I list files in a given directory
ERROR: Agent failed to generate a response: Response status code does not indicate success: 404 (Not Found).
(Full stack trace below under Error details.)
Ollama server log:
PS C:\Users\54718> ollama serve
2024/12/04 10:12:42 routes.go:1197: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:D:\workspace\OllamaModels OLLAMA_MULTIUSER_CACHE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://*] OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-12-04T10:12:42.578+08:00 level=INFO source=images.go:753 msg="total blobs: 6"
time=2024-12-04T10:12:42.580+08:00 level=INFO source=images.go:760 msg="total unused blobs removed: 0"
time=2024-12-04T10:12:42.582+08:00 level=INFO source=routes.go:1248 msg="Listening on 127.0.0.1:11434 (version 0.4.7)"
time=2024-12-04T10:12:42.584+08:00 level=INFO source=common.go:49 msg="Dynamic LLM libraries" runners="[cpu cpu_avx cpu_avx2 cuda_v11 cuda_v12 rocm]"
time=2024-12-04T10:12:42.584+08:00 level=INFO source=gpu.go:221 msg="looking for compatible GPUs"
time=2024-12-04T10:12:42.585+08:00 level=INFO source=gpu_windows.go:167 msg=packages count=1
time=2024-12-04T10:12:42.585+08:00 level=INFO source=gpu_windows.go:214 msg="" package=0 cores=4 efficiency=0 threads=8
time=2024-12-04T10:12:43.371+08:00 level=INFO source=types.go:123 msg="inference compute" id=GPU-067814a7-6a27-5d65-45e8-f8ad0239b5a2 library=cuda variant=v11 compute=6.1 driver=11.4 name="NVIDIA GeForce MX150" total="2.0 GiB" available="1.6 GiB"
[GIN] 2024/12/04 - 10:13:02 | 404 | 0s | 127.0.0.1 | POST "/api/generate"
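The GIN entry above shows the 404 is produced by the server itself for POST "/api/generate". Ollama answers 404 on that endpoint when the requested model is not installed, so probing it directly, outside AIShell, helps tell whether the agent is sending a model name that does not match `ollama list`. A minimal sketch (Python standard library only; `build_generate_request` and `probe` are hypothetical helper names, and the host matches the OLLAMA_HOST value in the log):

```python
import json
import urllib.error
import urllib.request

# Default host, matching OLLAMA_HOST in the server log above.
DEFAULT_HOST = "http://127.0.0.1:11434"


def build_generate_request(model: str, prompt: str, host: str = DEFAULT_HOST):
    """Build the POST request that Ollama's /api/generate endpoint expects."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{host}/api/generate",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def probe(model: str, prompt: str = "hi") -> int:
    """Send the request and return the HTTP status code, or -1 if the
    server is unreachable.

    A 404 with the exact model name from `ollama list` would point at the
    server; a 404 only for other names would suggest the agent requests a
    model that is not installed.
    """
    try:
        with urllib.request.urlopen(build_generate_request(model, prompt)) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 404 when the model is not found
    except urllib.error.URLError:
        return -1  # server not reachable at all

# Example against a live server: probe("codellama:latest")
```

If the probe succeeds with `codellama:latest` while the agent still gets a 404, the model name the agent puts in the request body is the first thing to check.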
Expected behavior
The agent should return a suggestion along the lines of Get-ChildItem "C:\path\to\directory" or ls "C:\path\to\directory".
Actual behavior
@ollama> How do I list files in a given directory
ERROR: Agent failed to generate a response: Response status code does not indicate success: 404 (Not Found).
Error details
ERROR: Agent failed to generate a response: Response status code does not indicate success: 404 (Not Found).
at System.Net.Http.HttpResponseMessage.EnsureSuccessStatusCode()
at AIShell.Ollama.Agent.OllamaChatService.GetChatResponseAsync(IStatusContext context, String input, CancellationToken cancellationToken) in
D:\AIShell-main\shell\agents\AIShell.Ollama.Agent\OllamaChatService.cs:line 72
at AIShell.Ollama.Agent.OllamaAgent.<>c__DisplayClass23_0.<<ChatAsync>b__0>d.MoveNext() in D:\AIShell-main\shell\agents\AIShell.Ollama.Agent\OllamaAgent.cs:line 114
--- End of stack trace from previous location ---
at Spectre.Console.Status.<>c__DisplayClass17_0`1.<<StartAsync>b__0>d.MoveNext() in /_/src/Spectre.Console/Live/Status/Status.cs:line 120
--- End of stack trace from previous location ---
at Spectre.Console.Progress.<>c__DisplayClass28_0`1.<<StartAsync>b__0>d.MoveNext() in /_/src/Spectre.Console/Live/Progress/Progress.cs:line 133
--- End of stack trace from previous location ---
at Spectre.Console.Internal.DefaultExclusivityMode.RunAsync[T](Func`1 func) in /_/src/Spectre.Console/Internal/DefaultExclusivityMode.cs:line 40
at Spectre.Console.Progress.StartAsync[T](Func`2 action) in /_/src/Spectre.Console/Live/Progress/Progress.cs:line 116
at Spectre.Console.Status.StartAsync[T](String status, Func`2 func) in /_/src/Spectre.Console/Live/Status/Status.cs:line 117
at AIShell.Kernel.Host.RunWithSpinnerAsync[T](Func`2 func, String status, Nullable`1 spinnerKind) in D:\AIShell-main\shell\AIShell.Kernel\Host.cs:line 386
at AIShell.Ollama.Agent.OllamaAgent.ChatAsync(String input, IShell shell) in D:\AIShell-main\shell\agents\AIShell.Ollama.Agent\OllamaAgent.cs:line 112
at AIShell.Kernel.Shell.RunREPLAsync() in D:\AIShell-main\shell\AIShell.Kernel\Shell.cs:line 642
Environment data
PS C:\Users\54718> $PSVersionTable
Name                           Value
----                           -----
PSVersion                      5.1.18362.752
PSEdition                      Desktop
PSCompatibleVersions           {1.0, 2.0, 3.0, 4.0...}
BuildVersion                   10.0.18362.752
CLRVersion                     4.0.30319.42000
WSManStackVersion              3.0
PSRemotingProtocolVersion      2.3
SerializationVersion           1.1.0.1
Version
terminal: Windows PowerShell 5.1 and PowerShell 7.4.6; model: codellama:latest