
Blog: Arize comparison - added keywords #2624

Open: LinaLam wants to merge 1 commit into main.

Conversation

@LinaLam (Collaborator) commented Sep 14, 2024

No description provided.

vercel bot commented Sep 14, 2024

Preview deployments for helicone, helicone-bifrost, and helicone-eu are ✅ Ready.

@@ -100,31 +107,61 @@ Arize Phoenix is an open-source LLM observability tool that focuses on providing

Arize Phoenix excels in its evaluation capabilities and is well-suited for data scientists and ML engineers working on complex LLM projects. However, it lacks some of the developer-friendly features that Helicone offers, such as self-hosting options, user tracking, and user feedback collection. Arize Phoenix's pricing model may also be less flexible than Helicone's tiered approach.

## Why Do Companies Choose Arize Phoenix Over Helicone?

- **Robust Evaluation Capabilities**: Ideal for data scientists focused on model performance.

Collaborator commented:

We have evals too!

LinaLam (author) replied:

but is it the same type of eval as we do?

https://docs.arize.com/phoenix/evaluation/llm-evals

- **Integration with ML Workflows**: Seamlessly fits into existing machine learning pipelines.

Collaborator commented:

maybe just keep this one

Both Helicone and Arize Phoenix offer powerful features for LLM observability, but they cater to slightly different audiences.

### Choose Helicone if you:
* Require self-hosting options for data control and compliance.

Collaborator commented:

No, we also have SOC2, the ability to omit storing bodies

LinaLam (author) replied:

what do you suggest here?

Collaborator replied:

If they're concerned about control and compliance, Helicone has SOC2, the ability to omit storing bodies and self-hosting options

Both Helicone and Arize Phoenix offer powerful features for LLM observability, but they cater to slightly different audiences. Helicone's user-friendly approach, comprehensive feature set, and flexible pricing make it an excellent choice for a wide range of users, from solo developers to small and medium-sized teams. Its self-hosting options and advanced features like user tracking and feedback collection give it an edge in many scenarios.
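
To make the user-tracking point above concrete, here is a minimal, illustrative sketch of Helicone's proxy-style integration: OpenAI requests are routed through Helicone's gateway and attributed to a user via request headers. The base URL, the `Helicone-Auth` and `Helicone-User-Id` header names, and the model name are assumptions drawn from Helicone's general approach, not from this PR; verify them against the current Helicone docs.

```python
# Minimal sketch (assumptions noted above): log OpenAI calls through
# Helicone's gateway and attribute them to a user for per-user tracking.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://oai.helicone.ai/v1",  # assumed Helicone gateway URL
    default_headers={
        # Assumed header names; check Helicone's docs for the exact spelling.
        "Helicone-Auth": f"Bearer {os.environ['HELICONE_API_KEY']}",
        "Helicone-User-Id": "customer-123",  # powers per-user metrics
    },
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat model works here
    messages=[{"role": "user", "content": "Hello from the Helicone sketch"}],
)
print(response.choices[0].message.content)
```

Because the instrumentation is only a base URL plus headers, the same pattern applies to any OpenAI-compatible client without code changes elsewhere.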
### Choose Arize Phoenix if you:
* Are a data scientist or ML engineer focused on model evaluation.
* Need advanced tools for assessing LLM performance.

Collaborator commented:

No, we have evals too!

Ultimately, the choice between Helicone and Arize Phoenix depends on your specific needs, team size, and the complexity of your LLM applications.

**For most users**, especially those looking for an all-in-one solution with a gentle learning curve and features like self-hosting, user tracking, and flexible pricing, **[Helicone](https://www.helicone.ai/)** is the more versatile and accessible option.

**For data scientists and ML engineers** working on complex LLM projects who require advanced evaluation capabilities and integration into existing ML workflows, **[Arize Phoenix](https://phoenix.arize.com/)** is preferable.

Collaborator commented:

Same here with evals. we do that too!
