2,375 questions
0
votes
0
answers
21
views
Why is the tool prompt provided after tool invocation in an MCP workflow?
I’m working with MCP server tools and have run into a question about the ordering of prompts and tool calls:
In my workflow I use a tool (for example split_task_raw) to decompose a task. After the ...
0
votes
0
answers
32
views
Gemini Nano AICore prompt failing with 11-RESPONSE_PROCESSING_ERROR on Android
I'm trying to run the following prompt on Gemini Nano, the on-device AI.
I'm running the prompt on the sample app Google provided:
TASK: Extract entities from the input text into a single JSON object.
...
-2
votes
1
answer
112
views
How to show timestamp only on Zsh prompt immediately after running a command, not before?
I want to customize my zsh prompt so that when I open the terminal, it shows a normal prompt without a timestamp, like:
ABIR~ ❯
When I type a command (for example, ls), the prompt still shows no ...
1
vote
0
answers
299
views
Failed to use apply_chat_template when adding tools
I failed to use apply_chat_template when using function calling. Is there something I missed?
Example
I downloaded tokenizer.json and tokenizer_config.json for testing function calling to see its full ...
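A minimal sketch of passing tools to apply_chat_template, assuming a recent transformers release and a checkpoint whose chat template actually supports tools; the Qwen2.5 repo and the weather tool below are placeholders, not taken from the question:
from transformers import AutoTokenizer

# Assumption: the model's chat template knows how to render tool definitions.
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-7B-Instruct")

get_weather = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

messages = [{"role": "user", "content": "What is the weather in Paris?"}]
prompt = tokenizer.apply_chat_template(
    messages,
    tools=[get_weather],        # rendered by the chat template, if it supports tools
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)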
0
votes
1
answer
55
views
Anaconda prompt broken: input line is too long, the syntax of the command is incorrect
After a Windows update, I suddenly got this error today when starting the Anaconda Prompt: "input line is too long, the syntax of the command is incorrect". It seems like there is something ...
0
votes
0
answers
67
views
Python multiline prompt
I have been trying to change my prompt in sys.ps1 with PYTHONSTARTUP.
My idea was to show some information (time, path) on the first line and the prompt on the second. When I tried changing ps1 ...
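For reference, a minimal sketch of a two-line prompt via PYTHONSTARTUP; it relies on the documented behavior that a non-string sys.ps1 is re-converted with str() before each interactive command (the startup file path is just an example):
# Contents of the file pointed to by PYTHONSTARTUP, e.g. ~/.pythonrc.py
import os
import sys
import time

class TwoLinePrompt:
    # str() is re-evaluated each time the REPL is about to show the prompt,
    # so the time and working directory stay current.
    def __str__(self):
        return f"{time.strftime('%H:%M:%S')} {os.getcwd()}\n>>> "

sys.ps1 = TwoLinePrompt()
sys.ps2 = "... "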
1
vote
1
answer
395
views
How do I get google/gemma-2-2b to strictly follow my prompt in Hugging Face Transformers?
I'm using the following code to send a prompt to the "google/gemma-2-2b" model via Hugging Face's Transformers pipeline:
from transformers import AutoTokenizer, AutoModelForCausalLM, ...
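A hedged sketch of one common approach: wrap the instruction in the model's chat template and decode greedily so the output tracks the prompt more closely. The instruction-tuned checkpoint google/gemma-2-2b-it and the generation settings are assumptions, not taken from the question:
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2-2b-it"  # assumption: the instruction-tuned variant
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

messages = [{"role": "user",
             "content": "Answer with exactly one word: what is the capital of France?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output = model.generate(input_ids, max_new_tokens=20, do_sample=False)  # greedy decoding
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))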
0
votes
0
answers
296
views
Dot Prompt syntax for defined prompt in Firebase Genkit
Below are the important variables defined in order to call prompt from CLI or Firebase Web.
Schema & Function
export const OperationSchema = z.object({
a: z.number(),
b: z.number(),
});
...
0
votes
1
answer
617
views
invoke_model of boto3 is not accepting the parameter explicitPromptCaching as one of its arguments
I am using the invoke_model function of the boto3 library with AWS Bedrock for Claude Sonnet 3.5 v2. I am trying to use prompt caching through Bedrock's InvokeModel API for Claude; according to the ...
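If it helps, here is a hedged sketch of how prompt caching is typically requested for Claude on Bedrock: as cache_control markers inside the Anthropic-style request body rather than as a top-level invoke_model keyword argument. The model ID and field layout are assumptions; check the Bedrock documentation for your model and region:
import json
import boto3

client = boto3.client("bedrock-runtime")

body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "system": [
        {
            "type": "text",
            "text": "A long, reusable system prompt goes here ...",
            "cache_control": {"type": "ephemeral"},  # mark this prefix as cacheable
        }
    ],
    "messages": [{"role": "user", "content": "Summarize the guidelines above."}],
}

response = client.invoke_model(
    modelId="anthropic.claude-3-5-sonnet-20241022-v2:0",  # illustrative model ID
    body=json.dumps(body),
)
print(json.loads(response["body"].read()))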
0
votes
1
answer
441
views
Why do different prompts affect how I can run Python code in VSCode?
In VSCode, I can run Python code from a .py file by selecting the code in the editor then typing shift+enter. It runs without error and opens a Python terminal (prompt turns to >>>). However, ...
0
votes
0
answers
21
views
Chat application works in a notebook environment but not on XAMPP when served with the Bokeh server
My chat application, which uses Panel, does not show the chat interface when served with the Bokeh server.
It appears to be a compatibility issue between Bokeh and my Python packages.
To solve it, I made a separate app to ...
0
votes
0
answers
250
views
Amazon Bedrock Prompt Management Variants is not working
I tried to create a prompt with two variants, but I could not get it working: even when I specified two variants, it only created the first one and completely ignored the other. The sample of the program ...
0
votes
1
answer
16
views
failure to start npx expo start
Set-ExecutionPolicy-Scope : The term 'Set-ExecutionPolicy-Scope' is not recognized as the name of a cmdlet, function,
script file, or operable program. Check the spelling of the name, or if a path was ...
2
votes
1
answer
89
views
How to parse a resume with a few-shot method using specified models from Hugging Face and LangChain?
I am running into model-selection confusion and some errors while trying to parse a resume with the following code.
I am trying to do few-shot prompting with a Google Flan-T5 base model.
While doing so I am getting an ...
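For the prompting side alone, a minimal few-shot sketch with LangChain's FewShotPromptTemplate; the example snippets and field names are invented for illustration, and the rendered string can then be fed to a Flan-T5 pipeline or any other LLM wrapper:
from langchain_core.prompts import FewShotPromptTemplate, PromptTemplate

example_prompt = PromptTemplate.from_template(
    "Resume snippet: {snippet}\nExtracted: {fields}"
)

# Toy examples; real ones would come from labeled resumes.
examples = [
    {"snippet": "John Doe, Python developer, 5 years at Acme.",
     "fields": '{"name": "John Doe", "skills": ["Python"], "experience_years": 5}'},
    {"snippet": "Jane Roe, data analyst, SQL and Tableau.",
     "fields": '{"name": "Jane Roe", "skills": ["SQL", "Tableau"]}'},
]

few_shot = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    prefix="Extract structured fields from each resume snippet as JSON.",
    suffix="Resume snippet: {snippet}\nExtracted:",
    input_variables=["snippet"],
)

print(few_shot.format(snippet="Alex Kim, ML engineer, PyTorch and AWS."))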
0
votes
1
answer
4k
views
Dataset format / prompt template for fine-tuning Qwen 2.5-Coder Instruct
I see that codellama 7B Instruct has the following prompt template:
<s>[INST] <<SYS>>\n{context}\n<</SYS>>\n\n{question} [/INST] {answer} </s>
But I could not find ...
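One low-risk option, sketched below, is to avoid hand-writing a template string at all and let the tokenizer shipped with the model render training examples; Qwen2.5 checkpoints come with a ChatML-style chat template (the repo name is illustrative):
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-Coder-7B-Instruct")

example = [
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Write a function that reverses a string."},
    {"role": "assistant", "content": "def reverse(s):\n    return s[::-1]"},
]

# Renders <|im_start|>role ... <|im_end|> blocks exactly as the model expects.
print(tokenizer.apply_chat_template(example, tokenize=False))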
0
votes
1
answer
108
views
zsh error setting prompt, NOT using oh-my-zsh
I am having trouble setting the dracula theme for zsh. I am NOT using oh-my-zsh.
I have downloaded the theme and I have placed it in the correct folder. When I run
~/ prompt -l
dracula adam1 adam2 ...
1
vote
0
answers
329
views
Chat template for any base model
I want to instruction fine-tune the base pretrained Llama 3.1 model, but the base model does not have a chat template on Hugging Face. Can I use a chat template of my choice to fine-tune the model for my ...
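One common approach, sketched here under the assumption that you have access to the gated Llama 3.1 repos, is to copy the instruct variant's template (or assign any Jinja template you intend to train with) onto the base tokenizer before building the dataset:
from transformers import AutoTokenizer

base = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B")
instruct = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")

# chat_template is just a Jinja string, so it can be reassigned freely.
base.chat_template = instruct.chat_template

messages = [{"role": "user", "content": "Hello!"},
            {"role": "assistant", "content": "Hi, how can I help?"}]
print(base.apply_chat_template(messages, tokenize=False))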
0
votes
0
answers
61
views
Is there any way to prompt the user from a PowerShell script such that the user is not able to ignore it?
I am able to show a prompt to the user from a PowerShell script, but the issue is that the user is able to ignore it. What I mean is that the user can open other programs and files even while the prompt ...
0
votes
0
answers
263
views
GCP Gemini API - Send multimodal prompt requests using local video
Hope someone can help.
The sample code shows how to send multimodal prompt requests with locally stored images:
import vertexai
from vertexai.generative_models import GenerativeModel, Part, Image
...
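A hedged sketch for a local video: read the bytes and wrap them in Part.from_data with the matching MIME type instead of Image.load_from_file. Project, region, model name, and file path are placeholders:
import vertexai
from vertexai.generative_models import GenerativeModel, Part

vertexai.init(project="my-project", location="us-central1")
model = GenerativeModel("gemini-1.5-flash")

with open("clip.mp4", "rb") as f:
    video_part = Part.from_data(data=f.read(), mime_type="video/mp4")

response = model.generate_content([video_part, "Describe what happens in this video."])
print(response.text)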
0
votes
1
answer
45
views
User response is unable to be read in script
I am writing a QC check script for my job. I have set it up so that you define a year, and then it prompts the user with a question about which quality control types and lakes they want to run the ...
0
votes
1
answer
56
views
How to make sure user prompted response equals at least one value from a provided vector?
I am developing a QC check script for my job. I want it all within one function, so the user is prompted to supply the arguments used within the function. Here is an ...
0
votes
1
answer
494
views
How to Render Prompt String in Semantic Kernel for Evaluation with Prompt Flow SDK in Python?
I am currently working on a project where I use Semantic Kernel with a plugins directory and multiple skills. My ultimate goal is to develop a GitHub workflow to evaluate my prompts using the ...
0
votes
1
answer
178
views
How to keep a command window open after running a batch file?
When I run a compilation command from a batch file, the command executes, but once it completes the command window closes immediately.
I expect the result to stay visible in the command window after ...
1
vote
0
answers
310
views
TypeError: object of type ChatPromptValue has no len()
I was trying to create a chatbot using an LLM and a ChatPromptTemplate, but I am facing this error: TypeError: object of type ChatPromptValue has no len(). Could someone let me know where this error ...
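A minimal sketch of the usual fix: a ChatPromptValue is neither a string nor a list, so convert it with to_messages() or to_string() before passing it to anything that calls len(). Import paths assume a recent langchain-core:
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("human", "{question}"),
])

prompt_value = prompt.format_prompt(question="What is LangChain?")
messages = prompt_value.to_messages()  # a list of messages, so len() works
text = prompt_value.to_string()        # plain string for completion-style LLMs

print(len(messages), text[:40])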
0
votes
1
answer
15k
views
How to print input requests and output responses in Ollama server?
I'm working with Langchain and CrewAI libraries to gain an in-depth understanding of system prompting. Currently, I'm running the Ollama server manually (ollama serve) and trying to intercept the ...
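One way to see the exact prompts and completions without patching the server, sketched here under stated assumptions, is LangChain's global debug flag, which logs every serialized request and raw response passing through a chain (model name is illustrative):
from langchain.globals import set_debug
from langchain_ollama import ChatOllama

set_debug(True)  # prints serialized prompts and raw responses to stdout

llm = ChatOllama(model="llama3.1")
print(llm.invoke("Why is the sky blue?").content)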