
Creating a LiveView Data Table before creating a named schema and EventFlow module


Sharad Honavar


Hello,

StreamBase 10.6: the LiveView development docs' instructions for creating a LiveView project start with creating a LiveView Data Table through a wizard, and then running a Data Source wizard to auto-create the EventFlow module (.sbapp) and the interfaces that feed the Data Table. However, the Data Table wizard forces me to type in a local schema, which is not what I want. The Choose button in the LiveView Data Table wizard is empty if there's no EventFlow module with a named schema. Seems like a circular conundrum. How do I get around this? I want the automatic interface creation of the Data Source wizard and to avoid manual schema creation.

 

Thank You


A bit confused by your question. The definition of the new table's fields has to come from somewhere. Where do you want your field definitions to come from, initially? Your choices are to type them in manually or to import them from somewhere else, whether that is "outside" Studio (like a CSV or XML file) or from some other StreamBase project. You don't want to type in your schema manually (I get it), but Studio doesn't force you to -- you can import or infer a schema from a variety of file types. So, then, where do your definitions come from?
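(For context: whichever way you fill them in, the wizard ultimately just writes the field definitions into the table's lvconf file. Roughly -- the table name, field names, and types below are invented for illustration, and the exact XML varies by Streaming version -- it ends up looking like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <liveview-configuration xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
        <data-table id="Orders" description="example table; placeholder fields">
            <fields>
                <field name="orderId" type="long"/>
                <field name="symbol" type="string"/>
                <field name="quantity" type="int"/>
                <field name="lastMod" type="timestamp"/>
            </fields>
            <primary-key>
                <field ref="orderId"/>
            </primary-key>
        </data-table>
    </liveview-configuration>

So one way to frame your question: something has to put those field elements there the first time, whether that's you typing them, an import, or a copy from another schema.)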

 

Also, I think you may be confusing the schema from which the LV table fields might come with the interfaces that LiveView knows how to auto-generate from the table schema. Once a table has a schema, you can always right-click on the table's lvconf and do StreamBase > Generate LiveView Schemas, which, for a table named x, will generate an lvinterfaces/xSchemas.sbint whenever you want. Or, if you AREN'T confused about that, maybe I misunderstood your question, so feel free to clarify or further elaborate on what you want to do.
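(From memory -- element names are approximate, and the real generated file carries several schemas, e.g. for the data-in stream and for deletes -- that xSchemas.sbint is just an EventFlow interface file holding named schemas along these lines:

    <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
    <modify version="10.6.1">
        <add>
            <named-schemas>
                <schema name="xDataIn">
                    <field name="orderId" type="long"/>
                    <field name="symbol" type="string"/>
                </schema>
            </named-schemas>
        </add>
    </modify>

Anything in your EventFlow fragments that feeds or queries the table can then reference those schemas instead of redeclaring the fields.)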


Hi sbarber,

 

Sorry, I should have been more explicit. I actually connect a Query operator to a JDBC adapter and grab the DB (MS SQL Server) schema into the EventFlow sbapp using "Execute query and populate fields" (which I love) in the Query operator's Result Settings; the operator's output port is then connected to the Output Stream and passes all of the SQL result definitions to it.

 

After that, when I create a LiveView table, I cannot (or do not know how to) somehow pull that schema into the LiveView table schema, except by creating a named schema in EventFlow using Copy Schema From -> Existing Schema -> Intermediate Streams and selecting the Output Stream of the Query operator, OR by refactoring the Output Stream of the Query operator into a named schema, and then using that for the LV table definitions via Choose -> App+Schema.

 

Both approaches create an unnecessary extra named schema just so the LiveView table definition can be copied from it later, and the table wizard then creates a similar one for the table's LiveView interfaces in EventFlow.

 

Is there a way to do this without creating a named schema just for the LiveView Data Table definition? (I also exported the named schema to JSON to later copy into the LV table definition feature, which turned out to be unnecessary.)

 

Thank You

This isn't really an answer to your question, more of a discussion, but we can at least have that now that your use case is somewhat clearer via the comments.

I can't reproduce the behavior you describe, in which Copy Schema from an intermediate stream generates an extra named schema somewhere that will never be used again. (I don't disbelieve your observations; I just don't see it myself at this point on 10.6.1, and it's hard to know whether we're doing "the same thing" or not.)

 

I just see the schemas generated in the xTable.sbint file, which are certainly used again, if only to typecheck and create at runtime the very table just created; they are also there in preparation for work with publisher and application data sources and the like, as well as for the developer's use in a wide variety of situations. I personally find these schemas very handy, and not unnecessary at all.

 

As a matter of StreamBase and LiveView application design practice, I tend to go with a rough guideline that ANY schema that is going to be used in more than one place should have a named schema. So, if that JDBC Read Query operator's result schema is really going to be the basis for your LiveView data table, the result schema deserves its own named schema in EventFlow anyway. (Also, if you haven't learned this already, someday you are probably going to want to change that lovely JDBC Query operator's default setting for the SQL Result Fields property from "Result set from SQL Query" to "Explicitly Set Fields Below" using a named schema. That is, querying the result set metadata on every single typecheck that Studio does is a great way to avoid typing in that schema by hand initially, but after a while that behavior results in a number of nasty issues: your app not noticing that your DB schema is different in different deployment environments, or -- ack -- one of your developers doing metadata typecheck queries across an ocean and getting a 20 second response delay every time they change something in the app, or TIBCO Support not being able to help you debug your code because you didn't give them the SQL for your tables, etc.) These are just my experiences; I hope you find them interesting, at least.
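(A sketch of that guideline, with invented field names and approximate element names: the schema gets declared once as a named schema in the .sbapp or .sbint XML, and then both the Query operator's Explicitly Set Fields and the LiveView table definition are pointed at it, rather than each holding its own copy:

    <named-schemas>
        <schema name="SqlResultSchema">
            <field name="orderId" type="long"/>
            <field name="symbol" type="string"/>
            <field name="lastMod" type="timestamp"/>
        </schema>
    </named-schemas>

The payoff is that a DB column change means editing one schema declaration and letting typecheck flag everything downstream.)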

 

Now, that said, "importing" that EventFlow named schema for use in a LiveView data table in such a way that there is only ever one named schema with that set of fields -- that sounds ideal to me, but given the bugs and features I am seeing in Streaming 10.6.1, I'm not sure it can currently work that way all the time. That is, I think you are running into, or going to run into, issues around this EventFlow-to-LiveView schema sharing that are even bigger than the one you raise now. So some compromises, workarounds, and some duplication of schema definitions are going to be a way of life here as a practical matter, at least in the near term, in order to have a reasonable separation of, say, an EventFlow fragment that feeds a LiveView fragment, and appropriate decoupling of projects and such (not sure if that's what you are doing). Again, with apologies, but that's all out of my hands.


Hello,

 

Thanks for the great tips and pragmatic methodology guidance. You basically answered my question, but here are some clarifications on my dilemma and perhaps a suggestion for a future release.

 

I was not implying that creating a named schema (using Refactor on the SB Output Stream of the SB SQL Query operator, OR Copy from Intermediate Stream -> Existing Schema -> From Workspace/Interface -> select "SQL_call (port 1)") creates a named schema "never to be used again". Of course I deliberately create a named schema in SB with the intention of using it in the Create LiveView Table wizard, which in turn creates the "same" schemas for its interfaces to EventFlow -- xTable.sbint (in addition to other schemas).

 

Simply put: if I were to first create a LiveView project from scratch, which starts me off with the Create Table wizard, without an a priori existing SB project -- as expounded and recommended by the doc in LiveView Developer's Guide -> Project Tutorial -- and then create a Data Source for it using the wizard (again the workflow indicated by the doc), then there is no schema available at that point, named or not, nor can the LiveView wizard grab the external SQL schema.

 

So I had to create an EventFlow fragment with the JDBC construct + Query operator to grab the external DB schema and create a named schema, then come back to the LV Table wizard using right-click Create LV Table and pull in the SB named schema, which created the sbint schemas matching my named schema. From that point on, I had no use for the named schema I created initially, because I could use the sbint schemas. I guess it's one of those decoupling things we have to live with, unless LiveView had its own JDBC grab-and-populate feature.

 

About the other point, I do use the "Explicitly Set Fields Below" / "Populate fields from call..." magic. If the deployment DB has changed since, then I have bigger problems downstream, and the named schemas will also have to be revisited to match, so I might as well do the populate magic again.

 

Thank You.


(I went ahead and deleted your wayward Answer. I feel ya; it's easy to press the wrong thing in the Community UI.)

I think we're more or less on the same page at this point. I can't tell exactly what your workflow was (English is ambiguous, and the UI gives a whole lot of ways to accomplish things that seem similar but aren't exactly the same). In theory, you should be able to base everything on a single "bottom" schema and derive everything from that with no replication of definitions; in practice, the tools may fight you at some point, and there may be some bugs.

 

So you have to try different ways of doing things, plan out dependencies carefully, etc. May or may not be worth it, may or may not be possible, depending on your project and your team.

Note: if you have suggestions for product enhancements, leaving them here in a Community Answer won't necessarily (or even likely) bring them to the attention of TIBCO Streaming Product Management. That's what ideas.tibco.com is for (though certainly not all ideas are implemented).

