MetaCase Forum > MetaEdit+

Recommendations/experiences wrt UI mockup tools ?

Author
Message
verlsnake (Member)
Joined: 11.Feb.2009
Location: Paderborn, Northrhine-Westf
Points: 4
Posted: 11.Feb.2009 at 17:08
I have just recently entered the arena of dedicated UI mockup tools; to name a few:

- DesignerVista (http://www.designervista.com)
- Balsamiq Mockups (http://www.balsamiq.com/)
- Axure RP Pro 5 (http://www.axure.com)
- GUI Design Studio v3 (http://www.carettasoftware.com/)
- iRise (http://www.irise.com/)
- ProtoShare (http://www.protoshare.com/)
- Prototype Composer (http://www.serena.com/products/prototype-composer/index.html)
- Lucid Spec (http://www.elegancetech.com)

Do you MetaCase users/evaluators have any recommendations or experiences in this arena?

One goal could be:
- Describe the UI mockup using one of those dedicated tools
- Then transform the description into a format MetaEdit+ can digest

I suppose I would have a really hard time developing a comprehensive modeling language for UIs in MetaEdit+ myself; I mean a modeling language comparable to what one can achieve with those dedicated tools.

Is this a viable goal, or am I on the wrong track if MetaEdit+ is to play a vital role in such a workflow?
stevek (MetaCase)
Joined: 11.Mar.2008
Points: 643
Posted: 11.Feb.2009 at 19:37

There are all manner of interesting ways to use MetaEdit+ to get UI code!

The key difference compared to the UI mockup tools you mention is the end result. With the UI mockup tools, you get a Word document with pictures of screens and descriptions, to hand to a developer who programs the actual application from scratch. With MetaEdit+, you get a full running application straight from the initial models.

The initial models can still be drawn by the UI designers just as easily as with the mockup tools. They won't need programming or complex programming-like structures, just a high-level logical view of the UI. The models don't need to be pixel perfect; their layout can approximate what is wanted on screen, or instead be drawn to make the logic and flow easier to see. Because the actual UI is generated, instead of a simulation of what the screens might look like and how they would be connected, the actual running application can be tested straight off.

Of course all this depends on someone first defining a modeling language and generators for the particular domain. The actual GUI stuff is pretty similar across domains, but the best way to express structures, logic, rules and behavior varies considerably. Let's step away from UI mockup tools for a moment and look at how UIs are made with DSM in general.

In some quite common cases, you don't even need to draw a UI at all. Instead, you make a generator that automatically produces the UI definition based on a higher level description in the model. The model wouldn't explicitly say anything about the UI or its layout, focusing instead on the data structures the application will use, the business rules, and the interaction flow. With a simple generator, you get a simple UI: e.g. for each data field defined or implied in the model, you get a label and a GUI widget of the right type, linked to the right data element in your program. With more complicated generators you can do more intelligent layout. If for a particular screen you want to hand-define the GUI, you can mark the relevant graph in MetaEdit+ to use the hand-defined UI definition file rather than try to generate the UI automatically. If the generated UI definition file is in the format used by your IDE's visual UI designer or a 3rd party UI definition tool, you can use that as a starting point.
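To make the "simple generator, simple UI" idea concrete, here is a minimal sketch in Python. The field list, widget names, and output format are all invented for illustration; a real MetaEdit+ generator would be written in MERL and walk the actual model rather than a Python list.

```python
# Hypothetical sketch: for each data field defined in the model,
# emit a label plus a GUI widget of the right type, bound to the
# matching data element. All names here are illustrative.

WIDGET_FOR_TYPE = {
    "string": "TextBox",
    "boolean": "CheckBox",
    "enum": "ComboBox",
    "date": "DatePicker",
}

def generate_ui(fields):
    """Produce a UI definition line by line from (name, type) pairs."""
    lines = []
    for name, ftype in fields:
        widget = WIDGET_FOR_TYPE.get(ftype, "TextBox")
        lines.append(f'Label("{name}")')
        lines.append(f'{widget}(bind="{name}")')
    return "\n".join(lines)

print(generate_ui([("Name", "string"), ("Active", "boolean")]))
```

The point is only that the model never mentions widgets at all; the mapping from data to UI lives entirely in the generator, so changing the mapping regenerates every screen consistently.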

This approach can be extended by adding more UI information to the models. In the simplest case you may have each graph corresponding to one UI screen, with each data element appearing on only one screen. Once you get more complex cases where the same data element appears on more than one screen, you can separate the definition of the data elements from their use on particular screens: one modeling language for data elements and another for screens, with the objects in the screen graphs referring to data elements defined in the other graphs.

The UI can still be generated automatically from the screen graphs, with simple or more intelligent algorithms, and the screen graphs need not include detailed layout information - maybe just the visual order, or the ability to split the screen into columns or areas. The generator can then lay out the UI components with coordinates and sizes that work for the screen size you envisage; you can generate different versions of the UI for different screen sizes or widget sets, all from the same model. In some cases it may make sense to move the layout algorithm from the generator to a dynamic runtime framework.
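The layout step above can also be sketched briefly. This toy Python function stands in for the generator's layout algorithm: the screen graph supplies only the visual order of elements, and the generator assigns concrete coordinates for a chosen target screen size. The sizes, margins, and one-widget-per-row policy are assumptions made up for the example.

```python
# Hypothetical layout sketch: the model gives only the visual order;
# coordinates and sizes are computed per target screen size, so the
# same model can yield layouts for desktop and phone screens.

def layout(elements, screen_width, row_height=30, margin=10):
    """Return (name, x, y, width) for each element, one per row."""
    placed = []
    y = margin
    for name in elements:
        placed.append((name, margin, y, screen_width - 2 * margin))
        y += row_height
    return placed

# One model, two generated layouts:
desktop = layout(["Name", "Email", "Submit"], screen_width=800)
phone = layout(["Name", "Email", "Submit"], screen_width=320)
```

Swapping this function for a smarter one (columns, flow layout, widget-set-specific sizing) changes every generated screen without touching the models, which is exactly the division of labor described above.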

In none of the cases so far have the models tried to be pixel perfect, with objects in graphs having the exact visual coordinates of the desired widgets (and certainly not requiring you to enter X and Y coordinates as properties of the objects). You could try building such a "pixel perfect" modeling language, but you'll find it fights a little against what MetaEdit+ is designed for. E.g. objects in MetaEdit+ are gridded so their centres are on a grid point, and scale so their centres remain fixed; in UI mockup tools gridding is probably for the corners, and scaling leaves the opposite corner fixed. You can work around these by putting the target point of a symbol at the top left (the target point is what MetaEdit+ uses as the "centre"), and by scaling with Ctrl held down. For some other similar cases there are hacks or workarounds, for others maybe not, but in any case the result will probably not feel as nice as a dedicated UI mockup tool. If you just want basic UI layout in MetaEdit+ to complement other models there, it will probably be fine; otherwise I'd imagine users would prefer a dedicated tool.

So how to integrate with a dedicated UI tool? We've already covered integrating with the IDE's visual UI designer downstream from MetaEdit+, but what about having a UI mockup tool upstream, as you suggest? It's certainly not impossible: you can write a program or MERL generator to read the tool's UI definition files and create corresponding MetaEdit+ models. There would seem little point in making those models "pixel perfect" layouts; you already have that task handled well in the UI tool. But you could pick up higher-level information like the data elements, if the tool records it. That could work quite nicely for small cases, but for bigger cases you'll find it hard to integrate several UI definitions and their updates into an existing set of MetaEdit+ models, especially if data elements are reused across multiple screens. As you said, it sounds like a lot of work, and the results would require using two tools; so it would cost more in effort and tooling, yet most likely offer less than the normal DSM approach in MetaEdit+.
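A minimal sketch of that upstream import, assuming (purely for illustration) that the mockup tool saves screens as XML: parse the file, keep the higher-level data elements, and drop the pixel-level layout. The XML schema below is invented; a real importer would target the actual tool's file format and then emit something MetaEdit+ can read in.

```python
# Hypothetical importer sketch: extract data elements from a mockup
# tool's UI definition file, ignoring coordinates and purely visual
# widgets. The <screen>/<widget> format is made up for this example.

import xml.etree.ElementTree as ET

SAMPLE = """
<screen name="Customer">
  <widget type="TextBox" data="Name" x="12" y="40"/>
  <widget type="CheckBox" data="Active" x="12" y="80"/>
  <widget type="Image" x="200" y="10"/>
</screen>
"""

def extract_data_elements(xml_text):
    """Return (data element, widget type) pairs for widgets that are
    bound to data, dropping layout attributes and decorative widgets."""
    root = ET.fromstring(xml_text)
    return [(w.get("data"), w.get("type"))
            for w in root.iter("widget") if w.get("data")]

print(extract_data_elements(SAMPLE))
```

Even a crude filter like this shows where the approach strains: it recovers the data elements for one screen easily, but merging repeated imports into an existing model set, with the same data element reused across screens, is where the real work would be.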

verlsnake (Member)
Joined: 11.Feb.2009
Location: Paderborn, Northrhine-Westf
Points: 4
Posted: 11.Feb.2009 at 20:21
Hello Steve !

Thank you very much for these very thorough answers :-) ... Since I am not currently in the trenches, I cannot give any further fruitful feedback at this time.

Just this much: is there a powerful GUI modeling DSL already available for MetaEdit+, commercial or free? Because picture this: the modern desktop GUI paradigm is becoming almost ubiquitous, so once I have a powerful DSL for this GUI paradigm, I can cover pretty much every GUI known in the (software) universe so far :-) ...


Tschüss and Hei Hei

Kai

stevek (MetaCase)
Joined: 11.Mar.2008
Points: 643
Posted: 12.Feb.2009 at 12:38

There's currently no generally available modeling language explicitly for the modern desktop GUI paradigm. Of course we have example languages and/or generators that produce such GUIs (e.g. the Web Apps example), but the concepts in those languages are not things like "Text Box", "Checkbox", "Radio Button" etc. So the market for a powerful GUI modeling DSL is currently all yours, Kai!

Forum Software by Web Wiz Forums® version 12.05
Copyright ©2001-2022 Web Wiz Ltd.
