
Message being replaced on StreamBase

João Dessain Saraiva


It's difficult to diagnose your issue with any certainty, as you haven't provided very much information about your application.

Let's review what you have stated:


You are using StreamBase

You are consuming messages from an EMS queue

You want them to "increment" rather than being replaced

You've shown us a screenshot from a Spotfire table visualization.

You chose Spotfire Data Streams as the product tag for this post


What you haven't said is how EMS messages are getting into the visualization, or what the properties and parameters of the visualization are. And it's unclear what you mean by "increment."

Perhaps we can assume that the visualization is being created this way:


the stream of tuples being emitted from the EMS Consumer operator is being published into a LiveView data table

the visualization in the screenshot is using the Spotfire Data Streams connector as a data source to query against that LiveView data table to create a Spotfire table visualization


My guess is that what you are hoping for is that each tuple is inserted into the LiveView table, and that the visualization shows all the rows of the table and updates as new tuples are published into the table.

If all these guesses are correct, then my first thought is that the primary key on the LiveView table, and the values in the primary key fields of your tuples, are such that each tuple published to the table is updating a single existing row rather than inserting a new row. Primary key field values have to be unique, so a published tuple whose key matches an existing row replaces that row.
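To make that update-versus-insert distinction concrete, here is a minimal Python sketch of LiveView-style upsert semantics using a plain dict keyed by the primary key fields. The field names ("symbol", "price") are hypothetical, purely for illustration:

```python
# Sketch of upsert-by-primary-key semantics, as a LiveView data table behaves.
# Field names here are made up; substitute your own schema.

def publish(table, row, key_fields):
    """Insert or update a row, keyed by the primary key field values."""
    key = tuple(row[f] for f in key_fields)
    table[key] = row  # same key -> the existing row is replaced, not appended

table = {}
publish(table, {"symbol": "IBM", "price": 100.0}, ["symbol"])
publish(table, {"symbol": "IBM", "price": 101.5}, ["symbol"])   # same key: updates row
publish(table, {"symbol": "TIBX", "price": 20.0}, ["symbol"])   # new key: inserts row

print(len(table))  # 2 rows, not 3
```

If every tuple you publish carries the same primary key values, the table will forever show one row being overwritten, which matches the "replaced" behavior you describe.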

Another possibility is that the table visualization in Spotfire has a query associated with it that is somehow showing only one row.

But without more detailed information about how you have created your application from end to end, and what your data and schemas are, these are just guesses.

If I were going to try to debug this myself, I would probably take a look at the contents of the stream of tuples coming out of the EMS Consumer before the stream is published to the table, and then at the schema and primary key of the LiveView data table where the tuples are being stored. I might look at the table using a SELECT * FROM query either in LiveView Web or using lv-client to take a more direct look at the table contents to try to isolate where your expectations are first being violated.

Another thing to take a look at: I have noticed that sometimes when null values are being published into LiveView data tables as primary key field values, the table stops having new rows inserted. I have noticed this especially when using the LiveView Publish Adapter. I haven't taken the time to figure out why this is (IMO that would be a bug in LiveView, as null should be a valid primary key value), but I have noticed that when I filter out any null primary key values, the LiveView data table behaves more reasonably.
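The workaround amounts to dropping any tuple whose primary key fields are null before it reaches the publish step, analogous to placing a Filter operator upstream of the LiveView Publish Adapter. A small Python sketch (the field name "orderId" is hypothetical):

```python
# Sketch: filter out tuples with null primary key values before publishing,
# mirroring a Filter operator ahead of a LiveView Publish Adapter.

def has_null_key(row, key_fields):
    return any(row.get(f) is None for f in key_fields)

incoming = [
    {"orderId": "A1", "qty": 5},
    {"orderId": None, "qty": 7},   # null key: would be dropped
    {"orderId": "A2", "qty": 3},
]
publishable = [t for t in incoming if not has_null_key(t, ["orderId"])]
print(len(publishable))  # 2
```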

I hope this answer is helpful; please feel free to provide more details about your situation if that would help.


I have found, in general, when people ask short questions (not much detail provided), I end up giving long answers, because I have to do a lot of guesswork and there are many possible ways to do things with TIBCO Streaming. So when people take the time to ask long questions (more detail provided), I can often give much shorter -- and often much more definitive -- answers. :-)
