Scaling to large: > 100,000 model elements
How about scalability? From time to time I hear this question when showing various examples of domain-specific modeling and code generation. It is also a somewhat hard question to answer, as scalability can mean different things: the size of a diagram, the number of model elements, the depth of model hierarchies, the number of languages used in parallel, the number of concurrent engineers, and obviously the speed of the tool when working at "large scale". I recorded a short session opening and working with something that can be called large: tens of different languages, thousands of diagrams, and hundreds of thousands of model elements.
The video shows how MetaEdit+ performs: opening and working in a large project looks pretty much the same as working in a small project.
The tool is obviously one important part of providing scalability, but perhaps the right question is: how do we define modeling languages that acknowledge scalability?