What a great solution. They can leverage this to avoid liability for their own price quotes and solutions while simultaneously adding another layer of vendor lock-in. Synergy.
cowsandmilk 5 hours ago [-]
How does this avoid liability for anything?
mcintyre1994 5 hours ago [-]
I don't think anyone's really stated this outright, but large companies must believe they're not liable for anything their models/AI products are producing. That must be the case for their business model to work.
neuroelectron 3 hours ago [-]
A better question is how can they be held liable at all.
_pdp_ 8 hours ago [-]
On a related note, I'm not sure when it became "ok" to leave production credentials scattered across your system in configuration files. So many MCP server examples encourage this pattern, and inevitably, it's going to cause trouble at some point.
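As a concrete illustration of the pattern being criticized: many MCP server examples show a client configuration with the production token inline. A sketch of the shape (based on Claude Desktop's `claude_desktop_config.json` format; the server name and variable are illustrative, not from any specific example):

```json
{
  "mcpServers": {
    "cost-server": {
      "command": "npx",
      "args": ["some-cost-mcp-server"],
      "env": {
        "API_TOKEN": "plaintext-production-secret-sitting-on-disk"
      }
    }
  }
}
```

Anything that can read that file, including other tools and backup processes, now has the production credential.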
Game_Ender 4 hours ago [-]
What is your preferred way to manage them?
nel-vantage 17 minutes ago [-]
Vantage engineer who worked on this feature here. The security posture of MCP servers is still in its early stages (see “The ‘S’ in MCP Stands for Security” from three weeks ago [https://elenacross7.medium.com/%EF%B8%8F-the-s-in-mcp-stands...]). The recommendations above to use something like the 1Password CLI wrapper when invoking an MCP server seem sound.
That being said, an easier-to-distribute user experience would be to leverage short-lived OAuth tokens that LLM clients such as Claude or Goose ultimately manage for the user. We’re exploring these avenues as we develop the server.
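For reference, the 1Password CLI wrapper pattern mentioned above keeps secrets out of config files by storing only references and resolving them at launch time. A minimal sketch (the vault/item names and the MCP server command are illustrative, not from the original post):

```shell
# .env — contains secret *references*, never secret values
# VANTAGE_API_TOKEN=op://engineering/vantage-readonly/credential

# `op run` resolves the reference and injects the real value into the
# process environment only for the lifetime of the MCP server process
op run --env-file=.env -- npx some-mcp-server
```

The plaintext token never lands on disk; `op run` authorizes against the 1Password desktop app instead.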
devenjarvis 3 hours ago [-]
The 1Password CLI is great! However, if you aren't using 1Password as your secrets vault, I'm building an open source, vault-agnostic alternative called RunSecret (https://github.com/runsecret/rsec).
mdaniel 2 hours ago [-]
You may want to do your own Show HN about it, so folks don't have to be "MCP curious" to find out that it exists
That is extra weird when thinking about the audience who might be Vantage.sh users (and thus have the ability to create the read-only token mentioned elsewhere) but would almost certainly be using it from their workstation, in a commercial context. Sounds like you're trying to keep someone from selling your MCP toy and decided to be cute with the licensing text
bluck 12 hours ago [-]
I'm just trying to understand licenses, but doesn't the choice of MIT contradict the initial "non-commercial purposes" clause? MIT says 'including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software'. Therefore, is the non-commercial restriction actually void, and can I use the software to the limits that MIT defines? And because it is already MIT, can they relicense only future releases, but not this one anymore?
So if I want to use the software I just have to create a fork on my home machine for non-commercial purposes, update the license to MIT only, and then the fork is mine to do with as I want commercially? What's even the point of this license?
andrenotgiant 5 days ago [-]
What's the difference between connecting an LLM to the data through Vantage vs directly to the AWS cost and usage API's?
StratusBen 5 days ago [-]
A few things.
The biggest is giving the LLM context. On Vantage we have a primitive called a "Cost Report" that you can think of as being a set of filters. So you can create a cost report for a particular environment (production vs staging) or by service (front-end service vs back-end service). When you ask questions to the LLM, it will take the context into account versus just looking at all of the raw usage in your account.
Most of our customers will create these filters, define reports, and organize them into folders and the LLM takes that context into account which can be helpful for asking questions.
Lastly, we support more providers beyond AWS, so if you wanted to merge in other associated costs like Datadog, Temporal, Clickhouse, etc., you can do that as well.
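To make the "set of filters" idea concrete, here is a hypothetical sketch of what a Cost Report definition might look like; the field names are illustrative only, not Vantage's actual schema:

```json
{
  "title": "Production front-end costs",
  "folder": "Production",
  "filters": {
    "environment": "production",
    "service": "front-end",
    "providers": ["aws", "datadog"]
  }
}
```

When the LLM answers a question against this report, it scopes the query to these filters rather than the raw usage across the whole account.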
cat-whisperer 5 days ago [-]
Is this going to be different per provider, as resources end up getting intertwined? Or is there a way to standardize it?
That said, given https://github.com/runsecret/rsec#aws-secrets-manager, presumably in order to keep AWS credentials off disk one would then have to have this? In contrast to the op binary, which is just one level of indirection, since it already handshakes with the desktop app for $(op login) purposes.