What if I don't want to, or simply can't, share my data with any outside organization like OpenAI / Microsoft? Do you plan to offer a local deployment of the copilot in the future, too? Or do we have to wait until ChatGPT can be deployed locally?


Fabian Duerr

It is important to note that everything in the Azure architecture happens in your Azure subscription. Therefore, Microsoft will not have access to the data nor will the data be used to improve the foundation model; this is very similar to other Azure services. This is generally in contrast with OpenAI's terms. Let me break down the answer into two scenarios depending on what you mean by "share with":

  1. If you mean you need your organization to keep the data private, that is no problem. As with other Azure services, your data remains your data. The foundation model runs on Azure, within your instance, solely to operate on your data. The data is not shared with Microsoft or any other organization.
  2. If you mean you cannot or don't want to send your data to any third-party platform whatsoever, that may be an issue today. The few public LLM services available all require data to be sent to them.

Thanks for the clarification, Ahmad. It's more about the second option (no cloud solution). The current speed of development in LLMs is breathtaking. It's probably just a matter of time until some open-source models (like MPT-7B, Falcon, etc.) deliver very good results. It would be great if TIBCO could provide fine-tuned open-source models for Spotfire in the future (for scripting, data functions, and so on).

It is a very exciting development. I'm looking forward to learning more about the copilot.


Well, the good news is that we don't do fine-tuning. We do prompt engineering to provide the right context and instructions to the foundation model. The architecture is modular, so if and when you prefer to use a different LLM endpoint, we can use the exact same architecture and simply point it to your LLM of choice instead of Azure OpenAI.
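To illustrate the idea, here is a minimal, hypothetical sketch of such a modular design. Nothing here reflects the copilot's actual implementation; the endpoint URLs, class names, and the stubbed `answer` function are all assumptions made for illustration. The point is that prompt engineering (building context and instructions into the prompt) is independent of which LLM endpoint ultimately receives it, so swapping Azure OpenAI for a locally hosted open-source model is just a configuration change:

```python
from dataclasses import dataclass


@dataclass
class LLMEndpoint:
    """Hypothetical config for any chat-completion endpoint."""
    name: str
    base_url: str


# Swappable endpoints: a cloud service or a locally hosted model.
# Both URLs are placeholders, not real deployments.
AZURE = LLMEndpoint("azure-openai", "https://<your-resource>.openai.azure.com")
LOCAL = LLMEndpoint("local-llm", "http://localhost:8000/v1")


def build_prompt(instructions: str, context: str, question: str) -> str:
    """Prompt engineering: supply instructions and context, no fine-tuning."""
    return f"{instructions}\n\nContext:\n{context}\n\nQuestion: {question}"


def answer(endpoint: LLMEndpoint, prompt: str) -> str:
    # A real deployment would POST the prompt to endpoint.base_url;
    # stubbed here so the endpoint swap is the only moving part.
    return f"[{endpoint.name}] received prompt of {len(prompt)} chars"
```

Because `build_prompt` never touches the endpoint, the same prompt-engineering layer can serve `AZURE` today and `LOCAL` whenever an on-premises model is preferred.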

