Make Your Requirements Executable


You can now create an instant app from your schema, and add spreadsheet-like expressions for business logic – a complete system in minutes. In this article, we review a new technology from Espresso Logic that turns your requirements – schemas and logic – into working software, and show an example of building a full application from scratch.

Building Database Applications is Slow and Complex

Sponsored by: Espresso Logic
There is a significant gap in going from Analysis – schemas and business logic – to a working system. There are several reasons for this gap, including the following:


  • Speed and Complexity - Business Users expect things to be as fast and simple as a spreadsheet. They are not. In fact, development is snail-slow, with rocket-science complexity – even for just a quick prototype.
  • Transparency - Business Users cannot read code, so they cannot verify correctness until they are presented with a working system.


Executable Requirements - What Are They?

Directly executable requirements would address our issues of speed, complexity, and transparency. We introduce two technologies: Executable Schema and Reactive Programming.

As shown below, these map directly to key Analysis outputs: database design and formalized business logic. So, this approach fits well with your current process:

  • Executable Schema – the system constructs a fully running multi-screen application from your schema, perfect for business user collaboration to confirm the data model and discover logic requirements. No coding – it just runs.

  • Reactive Programming – transactional update requirements are often expressed as rules. The system directly executes spreadsheet-like rule expressions that address multi-table derivation and validation behavior, providing optimized SQL, transaction handling and dependency management.

In the following sections, we illustrate the process of using Executable Requirements:

  1. Develop initial schema
  2. Create a working application
  3. Collaborate
  4. Respond to Changes

Step 1 - Initial Schema

Devise your schema using your usual procedures and tools. Let’s use the following very simple example:

We need to control the Campaign Costs for Promotions and Conferences
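As a concrete sketch, that starting point might look like this when expressed as plain JavaScript objects (the table and column names are our assumptions for illustration, not the product's metadata format):

```javascript
// Hypothetical starting schema: one Campaign has many Promotions and
// Conferences. Names are illustrative, not the product's metadata format.
const schema = {
  Campaign:   { columns: ["IdCampaign", "Name", "Budget"] },
  Promotion:  { columns: ["IdPromotion", "IdCampaign", "Name", "Cost"],
                foreignKeys: { IdCampaign: "Campaign" } },
  Conference: { columns: ["IdConference", "IdCampaign", "Name", "Fee"],
                foreignKeys: { IdCampaign: "Campaign" } }
};

// The Executable Schema idea: every foreign key implies a master/detail screen.
function childTables(schemaDef, parent) {
  return Object.keys(schemaDef).filter(t =>
    Object.values(schemaDef[t].foreignKeys || {}).includes(parent));
}

console.log(childTables(schema, "Campaign")); // → [ 'Promotion', 'Conference' ]
```

The foreign keys are all the system needs to infer the master-detail screens shown in the next step.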

Step 2 - Create Working Application

Try as we might, effective communication with business users is difficult. But one thing that always works is showing users working screens.

Once you point Espresso at your database, it reads your schema and creates a working user interface – an app built from your schema, referred to as an Executable Schema. Right out of the box, the app provides:

  • Search Grid on the left that provides sorting and multi-field search on any number of tables. Results are fully paginated. The selected table in the example is Campaign

  • Master-detail records, with forms on the top right and details on the bottom right. The example shown is for Campaign1, with Conferences and Promotions as its children, derived from foreign key relationships

  • Update support, including updatable grids (see the Conference Name)

You can customize which fields are shown, along with their descriptions and formats (not shown).

Step 3 - Collaborate

Instantly available screens facilitate two kinds of discussions with business users:

  1. Confirm Data Model - identify missing attributes (“it’s missing Promotion and Conference Costs”), and entities (“wait – I forgot to tell you about Conference Gifts”).

  2. Elicit Behavior - the screens support update, which leads to the discussion “what’s supposed to happen when I press save?” We might uncover a behavior like this:

“We need to Control Campaign budgets”

General objectives like this need to be captured, and formalized into more precise terms. This process does not change with Executable Requirements – we do exactly what we've always done to formalize the requirements: a step-wise definition of terms, with a dialog something like this:

  • What is meant by control budgets? Promotions Cost plus Conferences Cost cannot exceed the Budget
  • What is the Promotions Cost? Sum of the Promotion.Costs for that Campaign
  • What is the Conferences Cost? Sum of the Conference.Costs for that Campaign
  • What is the Conference Cost? Fee plus the Gifts Cost
  • What is the Gifts Cost? Sum of the Gift.Values
  • What is the Gift Value? Count * the Gift’s Inventory.Cost

So, how do we turn these requirements into working software?

Step 4 – Respond to Change

Collaboration facilitated by working software has identified changes. Such changes may seem simple to End Users, but they affect the Data Model, the User Interface, and the update Behavior. We respond as described below.

Revised Data Model

We use our usual database tools to add the cost rollups, and the Gift and Inventory tables:

Now, we need to reflect these changes on our screens, and build the behavior for save, as discussed below.


Since the screens are derived from the schema, they reflect changes to the schema – automatically. So, our new Cost attributes show up, as does the capability to “drill down” to see related Gift data:

Updatable Grids and Foreign Key Lookups are standard patterns in data maintenance applications, and are created automatically.

In the diagram above, the Name column actually displays the Inventory Name rather than IdInventory, the foreign key. The system recognized this common pattern (a numeric foreign key such as IdInventory) and automatically created a join to a more suitable column (the Name). This novel approach makes the app much more useful, out of the box.
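The heuristic can be sketched as follows (our own illustration of the idea, not Espresso's actual algorithm):

```javascript
// Sketch of the auto-join heuristic: a numeric foreign key column is
// displayed as the first string-typed column of the parent table.
const tables = {
  Inventory: { columns: { IdInventory: "number", Name: "string", Cost: "number" } },
  Gift:      { columns: { IdGift: "number", IdInventory: "number", Count: "number" },
               foreignKeys: { IdInventory: "Inventory" } }
};

function displayColumn(fkColumn, childTable) {
  const parent = (tables[childTable].foreignKeys || {})[fkColumn];
  if (!parent) return fkColumn;                 // not a foreign key: show as-is
  const cols = tables[parent].columns;
  const friendly = Object.keys(cols).find(c => cols[c] === "string");
  return friendly ? `${parent}.${friendly}` : fkColumn;
}

console.log(displayColumn("IdInventory", "Gift")); // → "Inventory.Name"
```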

Logic: Spreadsheet-like Reactive Expressions

From above, we captured the following specification, clear to both IT and Business Users:

  • What is meant by control budgets? Conferences Cost plus Promotions Cost cannot exceed the Budget
  • What is the Promotions Cost? Sum of the Promotion Costs for that Campaign
  • What is the Conferences Cost? Sum of the Conference Costs for that Campaign
  • What is the Conference Cost? Fee plus the Gifts Cost
  • What is the Gifts Cost? Sum of the Gift Values
  • What is the Gift Value? Count * the Gift’s Inventory Cost


But it is far more than a specification – the answers are executable Reactive Expressions for business logic. Let’s take a closer look at this meaningful new technology.

Update business logic is a significant element of database systems. In a conventional approach, it is tedious to code, requiring significant amounts of change detection and propagation logic, SQL handling (with caching for performance), and so on. And it is far too complex for business users, so it does not facilitate collaboration.

We need to raise the abstraction level to make such logic substantially more expressive - simple and easy enough for Business Users to collaborate, yet powerful enough to address complex requirements. We’ve actually seen such a revolution in expressive power before: the spreadsheet. The core idea is remarkably simple and powerful: associate expressions with cells, and recompute when the referenced data changes.

We apply this spreadsheet paradigm, called Reactive Programming, to database transaction logic:

  1. Declare expressions to derive database columns (e.g., Campaign.PromotionsCost := sum(Promotions.Cost)). These are exactly analogous to spreadsheet cell formulas.

      Most interesting transactions are multi-table, so expressions must be able to reference related data. This enables – indeed obligates – the system to take responsibility not only for change detection, but also for persistence (reading/writing related data, caching, optimizations, etc.). Ideal implementations provide the flexibility to make aggregates stored (faster) or virtual (when the schema cannot be changed).
  2. Database updates are watched for changes to referenced values, which trigger reactions (e.g., adjustments) to the referencing values.
  3. Validations (e.g., Budget >= ConferencesCost + PromotionsCost) are provided as well.

So, we simply enter this logic directly via a Browser-based GUI (this example requires no code, no IDE), which is then summarized for us like this (order is irrelevant, since the system orders execution by dependencies):
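As a toy sketch of that dependency ordering (our own illustration, not the product's engine), rules can be declared in any order and evaluated by following their references:

```javascript
// Rules declared in arbitrary order; evaluation order follows dependencies,
// just as a spreadsheet recalculates cells in formula-reference order.
const rules = {
  conferenceCost:  { deps: ["giftsCost"],      fn: r => r.fee + r.giftsCost },
  giftValue:       { deps: [],                 fn: r => r.count * r.inventoryCost },
  conferencesCost: { deps: ["conferenceCost"], fn: r => r.conferenceCost },
  giftsCost:       { deps: ["giftValue"],      fn: r => r.giftValue }
};

function evaluate(row) {
  const done = new Set();
  const run = name => {
    if (done.has(name) || !rules[name]) return;
    done.add(name);
    rules[name].deps.forEach(run);   // evaluate prerequisites first
    row[name] = rules[name].fn(row);
  };
  Object.keys(rules).forEach(run);
  return row;
}

// One conference with a 500 fee and one gift line: 100 units at cost 2.
const row = evaluate({ fee: 500, count: 100, inventoryCost: 2 });
console.log(row.conferencesCost); // → 700
```

Note that the declaration order of the rules is irrelevant; only the references matter.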

Such logic is fully executable. For example, as a Gift is entered, the Reactive Programming runs:

  1. Watches for the insert, and fires the value rule (retrieving the Inventory Cost)
  2. Watches the value - its change fires the rule to adjust Conference.GiftsCost (the system figures out the SQL, caches as appropriate, etc.). Such chaining (one rule triggering another) enables simple rules to be combined to address complex transactional logic - just as in a spreadsheet
  3. Watches the GiftsCost - its change adjusts the Conference Cost
  4. Watches the Cost – its change adjusts Campaign.ConferencesCost
  5. Watches the ConferencesCost, and checks the validation. If Budget is exceeded, the entire transaction is automatically rolled back, and an exception is returned to the user
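The chain above can be sketched in miniature (a simplified, synchronous illustration with names and numbers of our own; the validation here checks conference costs only, and the real system derives the ordering and generates the SQL):

```javascript
// Simplified, synchronous sketch of the rule chain fired by a Gift insert.
const inventory  = { Pen: { cost: 2 } };
const conference = { fee: 500, gifts: [], giftsCost: 0, cost: 500 };
const campaign   = { budget: 1000, conferences: [conference], conferencesCost: 500 };

function insertGift(conf, camp, item, count) {
  const gift = { item, count, value: count * inventory[item].cost };  // 1. value rule
  conf.gifts.push(gift);
  conf.giftsCost = conf.gifts.reduce((s, g) => s + g.value, 0);       // 2. adjust GiftsCost
  conf.cost = conf.fee + conf.giftsCost;                              // 3. adjust Cost
  camp.conferencesCost =
    camp.conferences.reduce((s, c) => s + c.cost, 0);                 // 4. adjust ConferencesCost
  if (camp.conferencesCost > camp.budget)                             // 5. validation
    throw new Error("Budget exceeded - transaction rolled back");
  return gift;
}

insertGift(conference, campaign, "Pen", 100);  // 100 * 2 = 200
console.log(campaign.conferencesCost);         // → 700
```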

Automatic Reuse: less work, more quality

Our created screens, of course, can be used for multiple Use Cases. Consider moving a Gift to a different Inventory item using the Lookup function mentioned above.

In “manual programming”, it’s not easy to spot all the changes our logic must address. We might account for changes to Gift.Count, but overlook the implications of changes to Gift.InventoryId. This is a serious bug: compromised data integrity.

Reactive is different: the system watches for (all!) changes in referenced data, and takes the appropriate reaction. So we are already done: our Gift.Value rule accounts for the change to the Inventory Foreign Key.
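In miniature (our own illustration, with hypothetical names), the single declared rule covers both kinds of change:

```javascript
// The one declared rule (Value = Count * Inventory.Cost) reacts to any
// change in referenced data - Count or the Inventory foreign key alike.
const inventory = { Pen: { cost: 2 }, Mug: { cost: 5 } };

const giftValue = gift => gift.count * inventory[gift.item].cost;

function update(gift, changes) {
  Object.assign(gift, changes);     // whatever changed...
  gift.value = giftValue(gift);     // ...the same reaction fires
  return gift;
}

const gift = update({ item: "Pen", count: 10 }, {});
console.log(gift.value);            // → 20
update(gift, { item: "Mug" });      // move the Gift to a different Inventory
console.log(gift.value);            // → 50
```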

This is a big idea: automatic re-use. By encapsulating our logic directly into the data model attributes (rather than coded for each specific Use Case), our solution has actually solved 15 Use Cases (add, change, delete for each of the 5 tables), without error. Not only is it less work, it’s higher quality.

Underlying Technology: REST, Mobile, JavaScript, Security

While this article focuses on driving analysis into working software, the underlying system architecture has several aspects that are attractive from a technology perspective.

REST is used to communicate from the application to the server, for data access and logic.
REST is an attractive Web Services architecture that is network efficient, and an excellent basis for application integration.
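For a rough idea of the shape of such calls, here is a hypothetical request builder (the URL scheme and payload are our assumptions, not Espresso's documented API):

```javascript
// Hypothetical shape of a REST update request against the logic server.
// Endpoint layout and payload are assumptions, not a documented API.
function buildRequest(baseUrl, table, row) {
  return {
    method: "PUT",
    url: `${baseUrl}/rest/v1/${table}/${row.id}`,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(row)   // the server runs the reactive logic on save
  };
}

const req = buildRequest("https://api.example.com", "Gift", { id: 7, count: 3 });
console.log(req.url); // → "https://api.example.com/rest/v1/Gift/7"
```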

The application is JavaScript/HTML5, so runs on mobile devices like small form factor tablets or phablets.

While we focus here on declarative behavior, JavaScript is also available as required. JavaScript’s popularity is growing rapidly as the one language acceptable to all mixed-technology teams. It provides a perfect way to address complex logic and application integration, such as sending messages to other systems, mashing up multiple services, etc.

Another key requirement is security. Beyond familiar endpoint access, the system must enforce security at the row and column instance level. For example, you might want Sales Reps to see only their own orders.

Many systems provide such functionality in views, but that leads to a proliferation of views to define and maintain. A better approach is to encapsulate the security into the table as a role-based permission (a predicate), and ensure it is reused across Multi-Table Resources.

@{current_employee} designates a database row associated with a user, so that its columns (e.g., employee_id) can be used in filter expressions.
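A minimal sketch of the idea (our own illustration; the actual predicate syntax and enforcement differ):

```javascript
// Sketch of a role-based row filter: the predicate is declared once on the
// table and substituted per-user at request time (illustrative syntax only).
const permissions = {
  Orders: { SalesRep: "employee_id = @{current_employee}.employee_id" }
};

function rowFilter(table, role, currentEmployee) {
  const predicate = (permissions[table] || {})[role];
  if (!predicate) return null;      // role is unrestricted on this table
  return predicate.replace("@{current_employee}.employee_id",
                           String(currentEmployee.employee_id));
}

console.log(rowFilter("Orders", "SalesRep", { employee_id: 42 }));
// → "employee_id = 42"
```

Because the filter lives on the table, every resource that touches Orders inherits it, rather than each view re-declaring it.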


So, let’s recap what we've done:

  1. Developed initial schema.
  2. Created a working application with zero effort – just by connecting.
  3. Collaborated to gather data model changes and save behavior.
  4. Responded to changes, automatically for the screens, and with 6 rules for behavior. These mapped directly to our formalized analysis.

The Executable Requirements approach may have seemed simple - it is - but that belies the actual amount of work delivered:

  • A complete web / mobile app
  • Multi-table transactional logic behavior addressing 15 Use Cases, with assured re-use / quality, and transparent to Business Users
  • An enterprise-class architecture, providing a REST-based platform for integration and Custom App Development

The entire process took under an hour - the logic required perhaps 5 minutes; the rest of the effort was data modeling, work required for any approach. By contrast, a conventional approach would have required weeks.

While designed for enterprise-class production, you may find considerable value in using it simply for prototyping and requirements clarification. Collaboration and rapid response to change have significant business value, delivering to today’s requirements rather than yesterday’s misunderstanding.

But what may be the most exciting is that this technology empowers a significantly wider set of professionals to imagine a system - and deliver it in remarkably short timescales - than ever before.

Our company has been executing on this vision, and now provides the elements described above. If you’d like to check out this technology, there’s a free eval with zero install (it’s a service) that you can try on your own database. We also provide an in-premise version to address high-security requirements.

And we’re always eager to exchange views about the technology - get in touch!


Author: Val Huber, Chief Technology Officer, Espresso Logic

Val has over 20 years of experience building technology to accelerate the development and maintenance of business applications. Building on key learnings from the past as well as leveraging the momentum of cloud computing, Val spent the last two years perfecting the technology to deploy business logic and security on REST and SQL in the cloud. Val’s goal in starting Espresso Logic is to offer a truly agile way for mobile and web developers to build backend applications. Prior to this, Val was the CTO of Versata, a company that was one of the top 5 IPOs of 2000 and was initially funded by visionaries such as Paul Allen and Hasso Plattner of SAP. At Versata, he led the development of a J2EE development environment for large-scale transactional and workflow applications. Versata supported tens of thousands of concurrent users and was deployed by many Fortune 500 firms including ADP, Equifax, Fidelity, IBM and Sears.

Val worked as an Architect at Sybase (now part of SAP) before Versata. He was also the driving force behind Wang’s highly regarded PACE (Professional Application Development Environment). PACE provided a relational DBMS, with a fully automated tool suite for reports and interactive applications. Val holds various patents and has enjoyed considerable commercial success of the systems developed by his teams.



reactive-analyst posted on Monday, July 28, 2014 12:33 PM
It is interesting indeed if it empowers lot of business analysts like me to actually create more of the application rather than waiting for the programmers.

What are the pitfalls I should be aware of?
Val Huber posted on Monday, July 28, 2014 2:59 PM
Good question, reactive-analyst. This is Val, the author...

The common fears about higher level languages are
* performance
* can they address complex problems (or get you 80% there and the last 20% is dreadful)
* fit into Enterprise architecture (is it an odd-ball technology)

We invested significant engineering into performance, with techniques like latency reduction, caching, SQL optimizations (1-row updates instead of aggregates), and pruning. More detail here:

Complexity is handled by
* the surprising power of the rules (e.g., a Bill of Materials in 5 rules - see
* JavaScript - you can do pretty much whatever it takes

Architectural fit is addressed by using the RESTful architecture, which enforces your logic during update requests. You can invoke it from pretty much any client or other system, and the JavaScript logic enables you to invoke other services - REST, SOAP, etc.

But the real test is in using it - there's a free trial available in the links in the article.
Zach posted on Monday, July 28, 2014 5:10 PM
Our company will not put any data in public cloud services - do you have any solution for people who want to deploy this solution behind their firewall?
Val Huber posted on Monday, July 28, 2014 10:59 PM
Yes, we have heard that concern often, so now offer a soft appliance (VMware image) which you can run inside your firewall. It's a common configuration for us.

It's also useful on laptops/desktops for evaluation on your corporate databases.




Copyright 2006-2024 by Modern Analyst Media LLC