"I was able to test and tweak my prompts effortlessly with this software. The ability to try out different variations is a game-changer."
"The trade-off between quality, latency, and cost is a thing of the past. This software has taken a lot of burden off my shoulders."
"The ability to create new templates quickly has been a lifesaver. It's now easier for me to tailor solutions for my projects."
"Prompt engineering used to be daunting until I found this software. Its features have helped me optimize my work incredibly."
No. Query Vary is designed to be easy to use for non-technical users. However, for certain advanced settings, such as configuring a vector database, you may need help from your IT department, or you can use Query Vary's fully managed service, where we help you develop your LLM-powered application.
Yes. You can upload documents directly in Query Vary, or connect your own vector database if you have a lot of data to add.
Yes. Query Vary has SOC2 Type 2 certification. As part of that, we encrypt your API keys using AES-256 encryption. Data uploaded to Query Vary is segregated into designated 'collections' for each organization. If desired, you can self-host your database and only connect it to Query Vary for data processing.
Free users get 250 LLM answers included, and paid users get $20 worth of LLM credits included. You can bring your own LLM keys if you wish to consume more, or if you'd like to separate billing for your company.
Yes. Please contact us to discuss your requirements.
Yes. Query Vary supports image interpretation models such as GPT-4 Vision and Claude 3. You can also generate and compare images from image generation models such as DALL-E 3 and DALL-E 2.
Query Vary continuously adds the latest models from OpenAI, Anthropic, Google, Azure, and other providers. You can assume the latest state-of-the-art models are available. If you have a self-hosted open-source model, you can connect it to Query Vary through the API. Please contact us.
Yes. Query Vary allows you to chain prompts together, whereby the output of one LLM is used as input to the next. This way, you can combine the strengths of different models to improve robustness, reduce latency, and lower costs.
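Conceptually, a prompt chain works like this minimal Python sketch. The `call_model` function is a hypothetical placeholder standing in for a real LLM call — Query Vary's own chaining is configured inside the product, not through this code.

```python
def call_model(model: str, prompt: str) -> str:
    """Placeholder for an LLM call; a real version would hit the provider's API."""
    return f"[{model} output for: {prompt}]"


def run_chain(steps, user_input: str) -> str:
    """Run each step in order, feeding the previous output into the next prompt."""
    text = user_input
    for model, template in steps:
        text = call_model(model, template.format(input=text))
    return text


# Example chain: a fast, cheap model drafts a summary,
# then a stronger model polishes it.
chain = [
    ("fast-model", "Summarize the key points of: {input}"),
    ("strong-model", "Polish this summary for clarity: {input}"),
]
result = run_chain(chain, "raw customer feedback ...")
```

The pattern lets each step use the model best suited to it, which is where the robustness and cost benefits come from.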
Yes, you can build any application you want on top of Query Vary’s LLM evaluation suite. Please have a look at our documentation for more information.