Wednesday, December 30, 2015

The best way to share data between controllers in AngularJS 1.x

You may know the situation in AngularJS 1.x where multiple independent controllers need to share some data, e.g. one controller adds some data that should be available in the other controllers in the same view. As far as I know, there are three possibilities:
  • Using $scope, e.g. $scope of a common parent controller or $rootScope
  • Using publish-subscribe design pattern via $emit / $broadcast (fire events) and $on (listen for events)
  • Using services which can be injected in multiple controllers.
The first possibility, sharing data via $scope, is a bad idea. If you share data via scopes, you have to know the controllers' parent-child relationship. That means your controllers are tightly coupled and refactoring becomes hard. Some AngularJS examples save the data on the $rootScope, but polluting the $rootScope is not recommended - keep the $rootScope as small as possible.

The event-based possibility, sharing data via $emit, $broadcast and $on, is a better approach than the scope-based one because the controllers are loosely coupled. But there is a disadvantage as well - performance. The performance is fine for a small application, but in a large application with hundreds of $on listeners, AngularJS has to traverse all scopes in order to find the $on listeners that match the corresponding $emit / $broadcast. From an architectural point of view there is a shortcoming too - we still need scopes to register $emit, $broadcast and $on. Some people also say that communication details should be hidden for the same reason we keep $http hidden behind a service layer. There are 5 guidelines for avoiding scope soup in AngularJS. The fifth and last rule is called "Don't Use Scope To Pass Data Around". It advises against using scopes directly and against the event-based approach. Last but not least - think about the upcoming AngularJS 2. It doesn't have scopes!
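For illustration, here is a minimal sketch of what the event-based approach could look like (the event name 'item:added' and the controller names are made up for this sketch):
// publisher: broadcast an event from $rootScope so that all child scopes can receive it
function SenderController($rootScope) {
    this.send = function(item) {
        $rootScope.$broadcast('item:added', item);
    };
}

// subscriber: listen for the event and deregister the listener when the scope is destroyed
function ReceiverController($scope) {
    var _self = this;
    this.list = [];

    var unregister = $scope.$on('item:added', function(event, item) {
        _self.list.push(item);
    });

    $scope.$on('$destroy', unregister);
}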

In my opinion, the preferred way to share data between controllers in AngularJS 1.x is the third possibility - services. A service can keep data or act as an event emitter (an example is shown below). Any service can be injected into controllers, so the controllers still don't know about each other and thus remain loosely coupled. I will refactor and extend one example from this blog post. The author of that post implemented two controllers for two panes. In the left pane we can input some text and add it as an item to a list in the right pane. The list itself is placed in a service (the service encapsulates the data). This is probably a good idea when you have different views or controllers in a conditional ng-if. But if the controllers exist in the same view and show their data at the same time, the list should reside in the controller for the right pane and not in the service - the list belongs to the second controller. That is my opinion, so I will move the list into the controller and also add a third "message controller" which is responsible for messages when the user adds items to the list or removes them from it. We thus have three controllers and one service. The idea is to apply the listener pattern (also known as observer) to the service. The service acts as an event emitter. Every controller can fire an ADD or REMOVE operation and register a listener function to be notified when the operation is done. The added / removed item is passed as data along with the operation to all registered listeners for that operation. The live example is implemented on Plunker.

The first picture shows the adding process (click on the button Add To List) with a message about the successfully added item.


The second picture shows the removing process (click on a link with x symbol) with a message about the successfully removed item.


The HTML part looks as follows:
<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8" />
    <meta content="IE=edge" http-equiv="X-UA-Compatible" />
    <script src="https://code.angularjs.org/1.4.8/angular.js"></script>
</head>

<body ng-app="app">
    <div ng-controller="MessageController as ctrlMessage" style="margin-bottom:20px;">
        <span ng-bind-html="ctrlMessage.message"></span>
    </div>

    <div ng-controller="PaneOneController as ctrlPaneOne">
        <input ng-model="ctrlPaneOne.item">
        <button ng-click="ctrlPaneOne.addItem(ctrlPaneOne.item)">Add To List</button>
    </div>

    <div ng-controller="PaneTwoController as ctrlPaneTwo" style="float:right; width:50%; margin-top:-20px;">
        <ul style="margin-top:0">
            <li ng-repeat="item in ctrlPaneTwo.list track by $index">
                {{item}}
                <a href style="margin-left:5px;" title="Remove" ng-click="ctrlPaneTwo.removeItem($index)">x</a>
            </li>
        </ul>
    </div>

    <script src="controller.js"></script>
    <script src="service.js"></script>
</body>
</html>
As you can see, there are three independent HTML div elements with an ng-controller directive. The MessageController shows a message at the top. The PaneOneController keeps an input value and the function addItem(). The PaneTwoController keeps a list with all items and the function removeItem(). The controllers look as follows in detail:
(function() {
    var app = angular.module('app', []);
    app.controller('PaneOneController', PaneOneController);
    app.controller('PaneTwoController', PaneTwoController);
    app.controller('MessageController', MessageController);

    /* first controller */
    function PaneOneController(EventEmitterListService) {
        var _self = this;
        this.item = null;

        this.addItem = function(item) {
            EventEmitterListService.emitAddItem(item);
            _self.item = null;
        }
    }

    /* second controller */
    function PaneTwoController($scope, EventEmitterListService) {
        var _self = this;
        this.list = [];

        this.removeItem = function(index) {
            var removed = _self.list.splice(index, 1);
            EventEmitterListService.emitRemoveItem(removed[0]);
        }

        EventEmitterListService.onAddItem('PaneTwo', function(item) {
            _self.list.push(item);
        });

        $scope.$on("$destroy", function() {
            EventEmitterListService.clear('PaneTwo');
        });
    }

    /* third controller */
    function MessageController($scope, $sce, EventEmitterListService) {
        var _self = this;
        this.message = null;

        EventEmitterListService.onAddItem('Message', function(item) {
            _self.message = $sce.trustAsHtml("<strong>" + item + "</strong> has been added successfully");
        });

        EventEmitterListService.onRemoveItem('Message', function(item) {
            _self.message = $sce.trustAsHtml("<strong>" + item + "</strong> has been removed successfully");
        });

        $scope.$on("$destroy", function() {
            EventEmitterListService.clear('Message');
        });
    }
})();
All three controllers communicate with a service called EventEmitterListService. The service exposes five methods:
  • emitAddItem - notifies listeners that are interested in adding an item to the list. The item is passed as parameter.
  • emitRemoveItem - notifies listeners that are interested in removing an item from the list. The item is passed as parameter.
  • onAddItem - registers a listener function that is interested in adding an item to the list. The listener is passed as parameter.
  • onRemoveItem - registers a listener function that is interested in removing an item from the list. The listener is passed as parameter.
  • clear - removes all registered listeners which belong to the specified controller. The controller is identified by the scope parameter (a simple unique string).
The clear method is important when the $scope gets destroyed (e.g. when the DOM associated with the $scope gets removed due to ng-if or view switching). This method should be invoked on the $destroy event - see the code snippets with $scope.$on("$destroy", function() {...}). The full code of the service is listed below:
(function() {
    var app = angular.module('app');
    app.factory('EventEmitterListService', EventEmitterListService);

    function EventEmitterListService() {
        // Format of any object in the array:
        // {scope: ..., add: [...], remove: [...]}
        // "scope": some identifier, e.g. it can be the part of controller's name
        // "add": array of listeners for the given scope to be notified when an item is added
        // "remove": array of listeners for the given scope to be notified when an item is removed
        var listeners = [];

        function emitAddItem(item) {
            emitAction('add', item);
        }

        function onAddItem(scope, listener) {
            onAction('add', scope, listener);
        }

        function emitRemoveItem(item) {
            emitAction('remove', item);
        }

        function onRemoveItem(scope, listener) {
            onAction('remove', scope, listener);
        }

        function clear(scope) {
            var index = findIndex(scope);
            if (index > -1) {
                listeners.splice(index, 1);
            }
        }

        function emitAction(action, item) {
            listeners.forEach(function(obj) {
                obj[action].forEach(function(listener) {
                    listener(item);
                });
            });
        }

        function onAction(action, scope, listener) {
            var index = findIndex(scope);
            if (index > -1) {
                listeners[index][action].push(listener);
            } else {
                var obj = {
                    'scope': scope,
                    'add': action == 'add' ? [listener] : [],
                    'remove': action == 'remove' ? [listener] : []
                }
                listeners.push(obj);
            }
        }

        function findIndex(scope) {
            var index = -1;
            for (var i = 0; i < listeners.length; i++) {
                if (listeners[i].scope == scope) {
                    index = i;
                    break;
                }
            }

            return index;
        }

        var service = {
            emitAddItem: emitAddItem,
            onAddItem: onAddItem,
            emitRemoveItem: emitRemoveItem,
            onRemoveItem: onRemoveItem,
            clear: clear
        };

        return service;
    }
})();
That's all. Happy New Year!

Wednesday, December 23, 2015

Mock responses to HTTP calls with network traffic simulation by using ngMockE2E

The AngularJS module ngMockE2E allows you to fake the HTTP backend implementation for end-to-end testing or backend-less development and to respond with static or dynamic responses via the when API and its shortcuts (whenGET, whenPOST, etc.). In this post, I will only demonstrate how to respond to any HTTP calls by mocking the responses. We will also see how network load can be simulated. Such behavior gives a feeling of real remote calls. The implemented example is available on Plunker. It shows a small CRUD application to manage imaginary persons. There are four buttons for sending AJAX requests in a REST-like manner. When a request is sent, you can see the text Loading... which disappears after 2 seconds when the response "arrives". In fact, no request is sent of course. This is just a simulation with the AngularJS module ngMockE2E. The simulated network load takes exactly 2 seconds, but you can set another delay value if you want.


The first button sends a GET request to /persons to receive all available persons. If you click on this button, you will see three persons received from the imaginary backend.


The second button sends a GET request to /person/:id to receive one person by its ID. The ID is appended to the URI part /person/. If you type e.g. 2 and click on this button, you should see the following person.


There is also proper validation. If you type e.g. 4, no person will be found because no person with the ID 4 exists in "the backend".


If you type some wrong ID which is not a numeric value, an error message will be shown as well.


The third button sends a POST request to /person to create a new person or update an existing one. The input fields name and birthdate are required. You can type a name and birthdate for a new person, send the POST request and see the created person received from "the backend".


Now, if you send a GET request to /persons (the first button), you should see the created person in the list of all persons.


The last button deletes a person by sending the DELETE request to /person/:id. The deleted person is shown above the buttons.


Click on the first button again to ensure that the person was deleted and no longer exists.


Let's look at the code. First of all, you have to include the file angular-mocks.js after the file angular.js in order to overwrite the original $httpBackend functionality.
<script src="https://code.angularjs.org/1.4.8/angular.js"></script>
<script src="https://code.angularjs.org/1.4.8/angular-mocks.js"></script>
What is $httpBackend? In AngularJS, we normally use the high-level $http or $resource service for HTTP calls. These services in turn use a low-level service called $httpBackend. The $httpBackend has useful methods to create new backend definitions for various request types. There is a method when(method, url, [data], [headers], [keys]) and many shortcut methods such as
  • whenGET(url, [headers], [keys])
  • whenHEAD(url, [headers], [keys])
  • whenDELETE(url, [headers], [keys])
  • whenPOST(url, [data], [headers], [keys])
  • whenPUT(url, [data], [headers], [keys])
  • ...
The most interesting parameter is the url. This can be a String like /persons, a regular expression like /^\/person\/([0-9]+)$/ for /person/:id, or a function that receives the url and returns true if the url matches the current definition. These methods return an object with respond and passThrough functions that control how a matched request is handled. The passThrough is useful when the backend REST API is ready to use and you want to pass the mocked request through to the real HTTP call (a small sketch is shown after the listing below). In this example, we only use respond. The respond can take the object to be returned, e.g. $httpBackend.whenGET('/persons').respond(persons), or a function with the signature function(method, url, data, headers, params) which returns an array containing the response status (number), response data (string), response headers (object), and the text for the status (string). The file app.js demonstrates how to use the $httpBackend.
(function() {
    var app = angular.module('app', ["ngMockE2E"]);
    app.run(HttpBackendMocker);

    // original list of persons
    var persons = [
      {id: 1, name: 'Max Mustermann', birthdate: '01.01.1970'},
      {id: 2, name: 'Sara Smidth', birthdate: '31.12.1982'},
      {id: 3, name: 'James Bond', birthdate: '05.05.1960'}
    ];

    // Reg. expression for /person/:id
    var regexPersonId = /^\/person\/([0-9]+)$/;

    function HttpBackendMocker($httpBackend) {
        // GET /persons
        $httpBackend.whenGET('/persons').respond(persons);

        // GET /person/:id
        $httpBackend.whenGET(regexPersonId).respond(function(method, url) {
            var id = url.match(regexPersonId)[1];
            var foundPerson = findPerson(id);

            return foundPerson ? [200, foundPerson] : [404, 'Person not found'];
        });

        // POST /person
        $httpBackend.whenPOST('/person').respond(function(method, url, data) {
            var newPerson = angular.fromJson(data);
            // does the person already exist?
            var existingPerson = findPerson(newPerson.id);

            if (existingPerson) {
                // update existing person
                angular.extend(existingPerson, newPerson);
                return [200, existingPerson];
            } else {
                // create a new person
                newPerson.id = persons.length > 0 ? persons[persons.length - 1].id + 1 : 1;
                persons.push(newPerson);
                return [200, newPerson];
            }
        });

        // DELETE: /person/:id
        $httpBackend.whenDELETE(regexPersonId).respond(function(method, url) {
            var id = url.match(regexPersonId)[1];
            var foundPerson = findPerson(id);

            if (foundPerson) {
                persons.splice(foundPerson.id - 1, 1);
                // re-set ids
                for (var i = 0; i < persons.length; i++) {
                    persons[i].id = i + 1;
                }
            }

            return foundPerson ? [200, foundPerson] : [404, 'Person not found'];
        });

        // helper function to find a person by id
        function findPerson(id) {
            var foundPerson = null;
            for (var i = 0; i < persons.length; i++) {
                var person = persons[i];
                if (person.id == id) {
                    foundPerson = person;
                    break;
                }
            }

            return foundPerson;
        }
    }
})();
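As mentioned above, only respond is used in this example. Just to give an idea, a minimal passThrough sketch could look like the following - the endpoints here are hypothetical and not part of this example:
app.run(function($httpBackend) {
    // mock this endpoint with a static response
    $httpBackend.whenGET('/persons').respond([{id: 1, name: 'Max Mustermann'}]);

    // let all other GET requests (e.g. HTML templates) reach the real backend
    $httpBackend.whenGET(/.*/).passThrough();
});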
Now we need a custom service that encapsulates the $http service. The service will get the name DataService. It is placed in the file service.js.
(function() {
    var app = angular.module('app');
    app.factory('DataService', DataService);

    function DataService($http) {
        var service = {
            getPersons: getPersons,
            getPerson: getPerson,
            addPerson: addPerson,
            removePerson: removePerson
        };

        return service;

        function getPersons() {
            return $http.get('/persons').then(
                function(response) {
                    return response.data;
                },
                function(error) {
                    // do something in failure case
                }
            );
        }

        function getPerson(id) {
            return $http.get('/person/' + id).then(
                function(response) {
                    return response.data;
                },
                function(error) {
                    if (error.status && error.status === 404) {
                        return error.data;
                    } else {
                        return "Unexpected request";
                    }
                }
            );
        }

        function addPerson(person) {
            return $http.post('/person', person).then(
                function(response) {
                    return response.data;
                },
                function(error) {
                    // do something in failure case
                }
            );
        }

        function removePerson(id) {
            return $http.delete('/person/' + id).then(
                function(response) {
                    return response.data;
                },
                function(error) {
                    if (error.status && error.status === 404) {
                        return error.data;
                    } else {
                        return "Unexpected request";
                    }
                }
            );
        }
    }
})();
The service DataService is invoked by a controller which I named DataController and placed in the controller.js.
(function() {
    var app = angular.module('app');
    app.controller('DataController', DataController);

    function DataController($scope, DataService) {
        var _self = this;
        this.persons = [];
        this.personId = null;
        this.person = {};
        this.message = null;
        this.loading = false;

        this.getPersons = function() {
            init();

            DataService.getPersons().then(function(data) {
                _self.persons = data;
                _self.loading = false;
            })
        }

        this.getPerson = function(id) {
            // check required input
            if ($scope.form.id4get.$error.required) {
                _self.message = "Please add person's id";
                return;
            }

            init();

            DataService.getPerson(id).then(function(data) {
                if (typeof data === "string") {
                    // error
                    _self.message = data;
                    _self.persons = null;
                } else {
                    _self.persons = [data];
                }

                _self.loading = false;
            })
        }

        this.addPerson = function(person) {
            // check required input
            if ($scope.form.name.$error.required) {
                _self.message = "Please add person's name";
                return;
            }
            if ($scope.form.birthdate.$error.required) {
                _self.message = "Please add person's birthdate";
                return;
            }

            init();

            DataService.addPerson(person).then(function(data) {
                _self.persons = [data];
                _self.loading = false;
            })
        }

        this.removePerson = function(id) {
            // check required input
            if ($scope.form.id4delete.$error.required) {
                _self.message = "Please add person's id";
                return;
            }

            init();

            DataService.removePerson(id).then(function(data) {
                if (typeof data === "string") {
                    // error
                    _self.message = data;
                    _self.persons = null;
                } else {
                    _self.persons = [data];
                }

                _self.loading = false;
            })
        }

        // helper function to reset internal state
        var init = function() {
            _self.persons = [];
            _self.message = null;
            _self.loading = true;
        }
    }
})();
Now we can use the controller in the view - the file index.html
<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8" />
    <meta content="IE=edge" http-equiv="X-UA-Compatible" />
    <script src="https://code.angularjs.org/1.4.8/angular.js"></script>
    <script src="https://code.angularjs.org/1.4.8/angular-mocks.js"></script>
</head>
<body ng-app="app">
    <div ng-controller="DataController as ctrl" ng-cloak>
        <ul style="padding-left:0; list-style:none;">
            <li ng-repeat="person in ctrl.persons">
                {{person.id}} - {{person.name}} - {{person.birthdate}}
            </li>
        </ul>

        <div style="margin-bottom:15px;" ng-show="ctrl.loading">Loading ...</div>
        <div style="color:red; margin-bottom:15px;" ng-show="ctrl.message">{{ctrl.message}}</div>

        <form name="form" novalidate>
            <button ng-click="ctrl.getPersons()">GET /persons</button>
            <p></p>
            <button ng-click="ctrl.getPerson(ctrl.personId)">GET /person/:id</button>
            <input ng-model="ctrl.personId" name="id4get" required placeholder="id" />
            <p></p>
            <button ng-click="ctrl.addPerson(ctrl.person)">POST /person</button>
            <input ng-model="ctrl.person.name" name="name" required placeholder="name" />
            <input ng-model="ctrl.person.birthdate" name="birthdate" required placeholder="birthdate" />
            <p></p>
            <button ng-click="ctrl.removePerson(ctrl.personId)">DELETE /person/:id</button>
            <input ng-model="ctrl.personId" name="id4delete" required placeholder="id" />
        </form>
    </div>

    <script src="app.js"></script>
    <script src="config.js"></script>
    <script src="service.js"></script>
    <script src="controller.js"></script>
</body>
</html>
The current implementation has one shortcoming. The text Loading ... is not shown at all because the response is delivered very quickly, so the user doesn't see it. It would be nice to delay the HTTP calls in order to simulate network load. For this purpose we can use the $provide service and register a service decorator for the $httpBackend. The service decorator acts as a proxy. If you look into the source code of the $httpBackend, you can see that it is a function with the signature function(method, url, data, callback, headers, timeout, withCredentials). The fourth parameter, callback, is responsible for the response. We have to provide our own implementation, which I named delayedCallback. The delayedCallback function invokes the original callback with a delay (here 2 seconds) by means of setTimeout(). The proxy delegates to the original $httpBackend call, but with the new delayed callback. The injected $delegate object points exactly to the $httpBackend. The full code is shown below.
(function() {
    var app = angular.module('app');
    app.config(HttpBackendConfigurator);

    function HttpBackendConfigurator($provide) {
        $provide.decorator('$httpBackend', HttpBackendDecorator);

        function HttpBackendDecorator($delegate) {
            var proxy = function(method, url, data, callback, headers, timeout, withCredentials) {
                // create proxy for callback parameter
                var delayedCallback = function() {
                    // simulate network load with 2 sec. delay
                    var delay = 2000;

                    // Invoke callback with delaying
                    setTimeout((function() {
                        callback.apply(this, arguments[0]);
                    }.bind(this, arguments)), delay);
                };

                // delegate to the original $httpBackend call, but with the new delayed callback
                $delegate(method, url, data, delayedCallback, headers, timeout, withCredentials);
            };

            // the proxy object should get all properties from the original $httpBackend object 
            angular.extend(proxy, $delegate);

            // return proxy object
            return proxy;
        }
    }
})();

Tuesday, October 27, 2015

PrimeFaces Extensions 4.0.0 released

Dear Community,

A new 4.0.0 version of PrimeFaces Extensions has been released. Artifacts are available in Maven Central.

Release notes are available, as usual, on the project's wiki page.

This version is fully compatible with the latest PrimeFaces 5.3.

Enjoy.

Monday, September 21, 2015

Promises in AngularJS. Part I. Basics.

There are many blog posts about Promises in general and Promises in AngularJS in particular. Promises are a part of the ES6 (EcmaScript 6) specification, so it is worth learning them. In this blog post, I will try to summarize the most important facts about Promises in a simple and clear way. The necessary code snippets will be provided too. The first part is about basics.

What is a Promise? A Promise is a solution to avoid the so-called "Pyramid of Doom". If you work with asynchronous operations (HTTP calls, file system operations, etc.), you know the situation where you have callbacks nested inside other callbacks, which are nested inside yet more callbacks, and so on. An example:
asyncOperationOne('first data', function(firstResult) {
    asyncOperationTwo('second data', function(secondResult) {
        asyncOperationThree('third data', function(thirdResult) {
            asyncOperationFour('fourth data', function(fourthResult) {
                doSomething();
            });
        });
    });
});
Every asynchronous operation invokes a callback function when the operation is completely done. In the case of AngularJS and the $http service, the "Pyramid of Doom" could look as follows:
var result = [];

$http.get('/api/data/first.json').success(function(data) {
    result.push(data);
    $http.get('/api/data/second.json').success(function(data) {
        result.push(data);
        $http.get('/api/data/third.json').success(function(data) {
            result.push(data);
            $scope.result = result;
        });
    });
});
We haven't shown error handling yet. Now imagine how this looks with error handling - terrible and unreadable. Promises help to synchronize multiple asynchronous functions and avoid the JavaScript callback hell. The official concept of Promises is described in the Promises/A+ specification. With promises, an asynchronous call returns a special object called a Promise. The calling code can then wait until that promise is fulfilled before executing the next step. To do so, the promise has a method named then, which accepts two functions - a success and an error function. The success function will be invoked when the promise has been fulfilled. The error function will be invoked when the promise has been rejected. An example of the $http service based on promises:
$http.get('/api/data.json').then(
    function(response) {
        ...
    },
    function(error) {
        ...
    }
);
We can say that a promise represents the future result of an asynchronous operation. The real power of promises is chaining. You can chain promises. The then function returns a new promise, known as a derived promise. The return value of one promise's callback is passed as a parameter to the callback of the next promise. In this way, you can access previous promise results in a .then() chain. An error is passed too, so that you can define just one error callback at the end of the chain and handle all errors there. Here is an example of promise chaining:
$http.get('/api/data.json').then(
    function(response) {
        return doSomething1(response);
    })
    .then(function(response) {
        return doSomething2(response);
    })
    .then(function(response) {
        return doSomething3(response);
    }, function(error) {
        console.log('An error occurred!', error);
    }
);
If you return a plain value from the derived promise as shown above, the derived promise will be resolved immediately. A derived promise can be deferred as well. That means the resolution / rejection of the derived promise can be deferred until the previous promise has been resolved / rejected. This is done by returning a promise from the success or error callback. The AngularJS $http service returns a promise, so we can rewrite the "Pyramid of Doom" with $http shown above as
$http.get('/api/data/first.json').then(
    function(response) {
        // response here comes from the GET to /api/data/first.json
        result.push(response.data);
        return $http.get('/api/data/second.json');
    })
    .then(function(response) {
        // response here comes from the GET to /api/data/second.json
        result.push(response.data);
        return $http.get('/api/data/third.json');
    })
    .then(function(response) {
        // response here comes from the GET to /api/data/third.json
        result.push(response.data);
        $scope.result = result;
    }, function(error) {
        console.log('An error occurred!', error);
    }
);
What happens when an HTTP call in the promise chain fails? In this case, the success callbacks of every subsequent promise in the chain will be skipped (not invoked). The next error callback in the promise chain, however, will be invoked. Therefore, the error callback is indispensable. An example:
$http.get('/api/data/wrong_or_failed_url.json').then(
    function(response) {
        // this function is not invoked
        return $http.get('/api/data/second.json');
    })
    .then(function(response) {
        // this function is not invoked
        return $http.get('/api/data/third.json');
    })
    .then(function(response) {
        // this function is not invoked
        $scope.result = response.data;
    }, function(error) {
        // error callback gets invoked
        console.log('An error occurred!', error);
    }
);
Instead of an error callback you can also use an equivalent - the promise's catch function. The construct promise.catch(errorCallback) is a shorthand for promise.then(null, errorCallback).
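A minimal sketch of the same error handling written with catch (the URL is only an example):
$http.get('/api/data.json')
    .then(function(response) {
        return response.data;
    })
    .catch(function(error) {
        console.log('An error occurred!', error);
    });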

Be aware that either the success or the error callback will be invoked, but never both. What can you do if you need to ensure that a specific function always executes, regardless of the result of the promise? You can do this by registering that function on the promise using the finally() method. In this method, you can reset some state or clear some resources. An example:
$http.get('/api/data.json').then(
    function(response) {
        // Do something in success case
    },
    function(error) {
        // Do something in failure case
    }).finally(function() {
        // Do something after either success or failure
    }
);
Where and how to use the promise examples above? A typical pattern in AngularJS is to have calls via $http in a service. Controllers call services and are not aware that $http is used. Example for MyController -> MyService -> $http:
// In MyService
this.fetchData = function() {
    return $http.get('/api/data.json');
};

// In MyController
$scope.fetchData = function() {
    MyService.fetchData().then(function(response) {
        return response.data;
    }, function(error) {
        ...
    }).finally(function() {
        ...
    })
};
It should also be mentioned that the AngularJS implementation of promises is a subset of the library Q, the best-known implementation of the Promises/A+ specification. That's all for this short post. In the next post, we will meet the AngularJS deferred object. A deferred object is an object that exposes a promise as well as the associated methods for resolving / rejecting that promise. It is constructed using the $q service - the AngularJS implementation of promises / deferred objects inspired by Q.
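As a small preview, creating and resolving a deferred object could look roughly like this - just a sketch, assuming $q and $timeout are injected; the details follow in the next part:
function asyncGreet(name) {
    var deferred = $q.defer();

    // simulate an asynchronous operation
    $timeout(function() {
        if (name) {
            deferred.resolve('Hello, ' + name + '!');
        } else {
            deferred.reject('No name given');
        }
    }, 1000);

    return deferred.promise;
}

asyncGreet('World').then(function(greeting) {
    console.log(greeting);
});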

Stay tuned!

Thursday, July 30, 2015

Magic $parse service in AngularJS

AngularJS has a useful but less documented $parse service, which is briefly described in the Angular documentation as "converts an Angular expression into a function". In this post, I will explain what this description means and show explicit and implicit usages of $parse. Furthermore, we will see the differences from Angular's $interpolate service and the $eval method.

Explicitly used $parse

The $parse service compiles an expression into a function which can then be invoked with a context and locals in order to retrieve the expression's value. We can imagine this function as a pre-compiled expression. The first parameter is context - this is the object against which any expressions embedded in the string are evaluated (typically a scope object). The second parameter is locals - this is a JavaScript object with local variables, useful for overriding values in the context. I've created a simple plunker to demonstrate the basic usage.
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
  </head>
  <body ng-app="app">
    <h1>Magical $parse service</h1>
    
    <div ng-controller="ParseController as vm">{{vm.parsedMsg}}</div>
    
    <script src="https://code.angularjs.org/1.4.3/angular.js"></script>
    <script src="app.js"></script>
  </body>
</html>
In the HTML we output the controller's variable parsedMsg which is created in the app.js as follows:
(function() {
  angular.module('app', []).controller('ParseController', ParseController);

  function ParseController($scope, $parse) {
    this.libs = {};
    this.libs.angular = {
      version: '1.4.3'
    };

    var template = $parse("'This example uses AngularJS ' + libs.angular.version");
    this.parsedMsg = template(this);
  }
})();
As you can see, the expression 'This example uses AngularJS ' + libs.angular.version gets parsed and assigned to the variable template. The template is a function which is invoked with the this object as context. The this object contains the value of libs.angular.version (nested objects). This value is used when evaluating the pre-compiled expression. The result is saved in this.parsedMsg and shown in the HTML.

It is important to understand that $parse can be invoked once and its result (the pre-compiled expression) can be used multiple times with a different context (and locals). The next plunker demonstrates the usage of just one pre-compiled expression with different locals. The HTML part:
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
  </head>
  <body ng-app="app">
    <h1>Magical $parse service</h1>
    
    <div ng-controller="ParseController as vm" ng-bind-html="vm.parsedMsg"></div>
    
    <script src="https://code.angularjs.org/1.4.3/angular.js"></script>
    <script src="https://code.angularjs.org/1.4.3/angular-sanitize.js"></script>
    <script src="app.js"></script>
  </body>
</html>
The JavaScript part:
(function() {
  angular.module('app', ["ngSanitize"]).controller('ParseController', ParseController);

  function ParseController($scope, $parse) {
    this.libs = ['angular.js', 'angular-sanitize.js', 'bootstrap.css'];

    var template = $parse("libs[i]")

    var output = '';
    for (var i = 0; i < this.libs.length; i++) {
      output = output + '<li>' + template(this, {i: i}) + '</li>';
    }

    this.parsedMsg = 'The project uses <ul>' + output + '</ul>';
  }
})();
As you can see, we pass {i: i} repeatedly as the second parameter, so that the expression libs[i] is always evaluated with the current value of i. That means libs[0], libs[1], etc. The result looks as follows:

There are some cool things you can do with $parse. I would like to only list three of them.

1) If the expression is assignable, the returned function (pre-compiled expression) will have an assign property. The assign property is a function that can be used to change the expression value on the given context, e.g. $scope. Example:
 
$parse('name').assign($scope, 'name2');
 
In the example, the value name on the $scope has been changed to name2.

2) If we have an object with properties that might be null or undefined, no JS errors will be thrown when using $parse. Instead, the result is set to undefined. In the example above, we had the property libs.angular.version. The result of
 
var version = $parse('libs.angular.version')(this);
 
is version = 1.4.3. If we used a non-existing property in the expression, e.g.
 
var version = $parse('libs.ember.version')(this);
 
the result would be version = undefined. AngularJS doesn't throw an exception.

3) Theoretically, you can send some logic from the backend to the client as a string and evaluate the parsed string on the client. Example:
var backendString = 'sub(add(a, 1), b)';

var math = {
  add: function(a, b) {return a + b;},
  sub: function(a, b) {return a - b;}
};

var data = {
  a: 5,
  b: 2
};

var result = $parse(backendString)(math, data); // 4

Implicitly used $parse

AngularJS also uses $parse internally. The next example was taken from a book and slightly modified by me. It demonstrates a color picker with four sliders (three for colors and one for opacity).

The whole code: the HTML page, the color-picker directive and app.js are demonstrated in my last plunker. The page uses the color-picker directive:
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8" />
    <meta content="IE=edge" http-equiv="X-UA-Compatible" />
  </head>
  <body ng-app="app">
    <h1>Magical $parse service</h1>
    
    <div ng-controller="MainController as vm">
      <color-picker init-r="255"
                    init-g="0"
                    init-b="0"
                    init-a="1.0"
                    on-change="vm.onColorChange(r,g,b,a)">
      </color-picker>
    </div>
    
    <script src="https://code.angularjs.org/1.4.3/angular.js"></script>
    <script src="app.js"></script>
  </body>
</html>
The directive's template colorPickerTemplate.html renders four input sliders, a live preview and current colors and opacity.
R:<input type="range"
         min="0" max="255" step="1"
         ng-model="r">
         <br>
G:<input type="range"
         min="0" max="255" step="1"
         ng-model="g">
         <br>
B:<input type="range"
         min="0" max="255" step="1"
         ng-model="b">
         <br>
A:<input type="range"
         min="0" max="1" step="0.01"
         ng-model="a">

<div style="width: 300px; height: 100px; margin-top:10px;
            background-color: rgba({{ r }}, {{ g }}, {{ b }}, {{ a }});">
</div>
{{ r }}, {{ g }}, {{ b }}, {{ a }}
The JavaScript part links it all together. Whenever a color or the opacity is changed, Angular's $log service logs the current colors and opacity in the browser dev console. The current colors and opacity are also shown permanently, as mentioned above.
(function() {
  angular.module('app', [])
    .controller('MainController', MainController)
    .directive('colorPicker', colorPickerDirective);

  function MainController($log) {
    this.onColorChange = function(r, g, b, a) {
      $log.log('onColorChange', r, g, b, a);
    };
  }

  function colorPickerDirective() {
    return {
        scope: {
          r:        '@initR',
          g:        '@initG',
          b:        '@initB',
          a:        '@initA',
          onChange: '&'
        },
        restrict: 'E',
        templateUrl: 'colorPickerTemplate.html',
        link: function(scope) {
          ['r', 'g', 'b', 'a'].forEach(function(value) {
              scope.$watch(value, function(newValue, oldValue) {
                  if (newValue !== oldValue) {
                      if (angular.isFunction(scope.onChange)) {
                          scope.onChange({
                            'r': scope.r,
                            'g': scope.g,
                            'b': scope.b,
                            'a': scope.a
                          });
                      }
                  }
              });
          });
        }
    };
  }
})();
But stop - where is the magic $parse here? The directive defines a parameter for an on-change callback via onChange: '&'. Behind the scenes, AngularJS parses the passed on-change expression string, in this case vm.onColorChange(r,g,b,a), and creates a function (the discussed pre-compiled expression) which can be invoked with the desired context and locals. In the directive, we invoke this function with the current values for the colors r, g, b and the opacity a.
scope.onChange({
    'r': scope.r,
    'g': scope.g,
    'b': scope.b,
    'a': scope.a
});
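Conceptually, and much simplified (this is not the actual Angular source; attrs, isolateScope and parentScope are only illustrative names), what Angular does for onChange: '&' can be sketched like this:
// the attribute value is the string "vm.onColorChange(r,g,b,a)"
var parsedExpr = $parse(attrs.onChange);

// the isolated scope gets a function that evaluates the expression against
// the parent scope; the object passed by the directive serves as locals,
// so r, g, b and a are taken from it instead of from the parent scope
isolateScope.onChange = function(locals) {
    return parsedExpr(parentScope, locals);
};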

Relation to $interpolate and $eval

$eval is a method on a scope. It executes an expression on the current scope and returns the result. Example:
scope.a = 1;
scope.b = 2;
 
scope.$eval('a+b'); // result is 3
Internally, it uses $parse itself:
$eval: function(expr, locals) {
    return $parse(expr)(this, locals);
}
Note that the $eval method cannot precompile an expression for repeated use. $parse, in contrast, can be invoked once in one place and its result can then be used repeatedly elsewhere. We have seen that in the second plunker. Here is another simple example to emphasize the difference from $eval:
var parsedExpression = $parse(...);

var scope1 = $scope.$new();
var scope2 = $scope.$new();

...

var result1 = parsedExpression(scope1);
var result2 = parsedExpression(scope2);
There is another high-level service - $interpolate. By means of $interpolate you can do on-the-fly templating like the well-known template engines Handlebars or Hogan do. ES6 has similar template strings too. Let's take a look at an $interpolate example.
(function() {
  angular.module('app', []).controller('InterpolateController', InterpolateController);

  function InterpolateController($scope, $interpolate) {
    var template = $interpolate('Book {{title}} is published by {{publisher | uppercase}}');

    var bookData1 = {title: 'PrimeFaces Cookbook', publisher: 'Packt'};
    var bookData2 = {title: 'Spring in Action', publisher: 'Manning'};

    // execute the function "template" and pass it data objects to force interpolation
    var bookInfo1 = template(bookData1);
    var bookInfo2 = template(bookData2);

    // the result:
    // bookInfo1 = "Book PrimeFaces Cookbook is published by PACKT"
    // bookInfo2 = "Book Spring in Action is published by MANNING"
  }
})();
As you can see, the $interpolate service takes a string and returns a function which is a compiled template. Expressions in the passed string may use any valid Angular expression syntax, e.g. a pipe with a filter as in the example {{publisher | uppercase}}. The function can be called again and again with different data objects. The $interpolate service uses $parse internally to evaluate the individual expressions against the scope. In this regard, the $interpolate service is normally used as a string-based template language for strings containing multiple expressions. In contrast to $parse, $interpolate is read only.
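A small sketch to illustrate the read-only point (the expression and variable names are arbitrary):
var parsed = $parse('name');
parsed.assign($scope, 'New name');   // a parsed expression can write back to the scope

var compiled = $interpolate('Hello {{name}}');
var text = compiled($scope);         // returns a string only
// there is no compiled.assign - an interpolated template cannot write values back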

Sunday, June 14, 2015

Create source maps at project build time for JavaScript debugging

If you're doing front-end web development, your web resources such as JavaScript and CSS files might be minified, transpiled or compiled from a completely different language. If you now want to debug the generated code in the browser, you can hardly do that because the generated output differs greatly from the code you wrote. The solution is to use source maps. A good introduction to source maps can be found in the articles "Introduction to JavaScript Source Maps", "Enhance Your JavaScript Debugging with Cross-Browser Source Maps" and "An Introduction to Source Maps".

I will try to summarize the quintessence of source maps and concentrate on JavaScript only. Generating CSS with LESS / SASS is interesting too, but it is not yet supported by the Maven plugin I will present in this blog post. How do you use a source map? In order to use a source map, your browser development tools need to be able to find it. If your generated code is JavaScript, one way to let the development tools know where to look is to add a comment to the end of the generated code which defines the sourceMappingURL - the location of the source map. For example: //# sourceMappingURL=mylib.js.map or //# sourceMappingURL=/mypath/mylib.js.map or //# sourceMappingURL=http://sourcemaps/mylib.js.map. If you now open the development tools and source map support is enabled, the browser will stream down the source map, which points to the original file, and show the original file instead of the generated one (minified, compiled, transpiled, etc.). Now you can set a breakpoint in the original file and debug it as if it were delivered with your web application! Such debugging is possible due to the fact that a source map provides a way of mapping code within a generated file back to its original position in a source file. I found a nice picture here to visualize the entire process.


All major browsers ship with built-in support for source maps. If you follow the links to the articles I mentioned above, you can see some examples for Internet Explorer 11, Firefox and Google Chrome. Chrome's Dev Tools enable source map support by default. Simply check whether the checkbox "Enable JavaScript source maps" is enabled.


If you are a fan of Firefox, you should use the Firefox Web Console (Shift + Ctrl + K) and check whether the same option is enabled in its settings too. Please note that Firebug doesn't support debugging with source maps; you have to use the native Firefox Web Console as I said. Internet Explorer 11 also rocks with source maps. It has even more features - you can add source map files from your local disk. Simply right-click on a generated (e.g. compressed) file and select "Choose source map". The article "Enhance Your JavaScript Debugging with Cross-Browser Source Maps", which I mentioned above, has a lot of pictures for IE and debugging TypeScript files.

Source maps can be created during your build. If you work with Node.js and npm, the best tool is UglifyJS - a popular command line utility that allows you to combine and compress JavaScript files. If you work with Maven, you can use my Maven plugin for resource optimization which I wrote some years ago and just updated to support source maps. It uses the superb Google Closure Compiler which offers support for source map creation and many other cool features. Please refer to the documentation of the Maven plugin to see how to set the various source map configuration options. The simplest configuration, which we also use in the current PrimeFaces Extensions release, looks as follows:
<plugin>
    <groupId>org.primefaces.extensions</groupId>
    <artifactId>resources-optimizer-maven-plugin</artifactId>
    <version>2.0.0</version>
    <configuration>
        <sourceMap>
            <create>true</create>
            <outputDir>${project.basedir}/src/sourcemap/${project.version}</outputDir>
            <sourceMapRoot>
              https://raw.githubusercontent.com/primefaces-extensions/core/master/src/sourcemap/${project.version}/
            </sourceMapRoot>
        </sourceMap>
    </configuration>
    <executions>
        <execution>
            <id>optimize</id>
            <phase>prepare-package</phase>
            <goals>
                <goal>optimize</goal>
            </goals>
        </execution>
    </executions>
</plugin>
This way, only compressed JavaScript files are packaged within the released JAR file. Uncompressed files are not in the JAR. The JAR is smaller and free from redundant stuff. For the PrimeFaces Extensions, the source maps and uncompressed files are checked in below the project root. For example, the folder with source maps and original files for the current release 3.2.0 is located here: https://github.com/primefaces-extensions/core/tree/master/src/sourcemap/3.2.0 That means a compressed file, say timeline.js, has the following line at the end:

//# sourceMappingURL=https://raw.githubusercontent.com/primefaces-extensions/core/master/src/sourcemap/3.2.0/timeline.js.map

The source map timeline.js.map has the content (I truncated some long lines):
{
  "version":3,
  "file":"timeline.js",
  "lineCount":238,
  "mappings":"A;;;;;;;;;;;;;;;;;;;;AA4DqB,WAArB,GAAI,MAAOA,MAAX,GACIA,KADJ,CACY,EADZ,CAUsB,YAAtB,GAAI,...
  "sources":["timeline.source.js"],
  "names":["links","google","undefined","Array","prototype","indexOf","Array.prototype.indexOf",...]
}
It would probably be better to publish these files on a CDN, like cdnjs, but CDNs are not really intended for source maps. Another option would be a PrimeFaces repository. We will see how to handle that in the future. The next picture demonstrates how debugging of the compressed JS file timeline.js looks in my Firefox.


Thanks to Firefox' built-in source map support we see the uncompressed JS file timeline.source.js. I set a breakpoint on the line with the if-statement if (index < 0). After that I clicked on an event in the Timeline component. The breakpoint was hit. Now I can debug step by step and see my local and global variables on the right side. As you probably see, the variable index is shown as a (I marked it red as var index). This is a shortcoming of the current source map specification. You can read this discussion for more details.

Again, keep in mind - the source maps and original files will only be loaded when you open the browser dev tools and enable this support explicitly. If the dev tools identify that a source map is available, it will be fetched along with the referenced source file(s). If a source map file is not available, you will see a 404 "not found" message in the dev tools (not bad at all in my opinion).

That's all. My next posts will be about AngularJS and less about JSF. Stay tuned.

Wednesday, June 10, 2015

PrimeFaces Extensions 3.2.0 released

Dear PrimeFaces community,

PrimeFaces Extensions 3.2.0 has been released! This is a maintenance release which is built on top of PrimeFaces 5.2. Closed issues are available on GitHub.

Some notes to the two important changes:
  • pe:ajaxErrorHandler was removed in favor of p:ajaxExceptionHandler. It was buggy and not working in the previous release.
  • Uncompressed JS files are no longer delivered within the JAR files. The compressed JS files contain //# sourceMappingURL=... which points to the appropriate source maps and uncompressed files for debugging purposes. The source maps and uncompressed files are checked in directly on GitHub and can be fetched from there. That means a compressed file has something like this at the end:
    //# sourceMappingURL=https://raw.githubusercontent.com/primefaces-extensions/core/master/src/sourcemap/3.2.0/primefaces-extensions.js.map

    More info about source map is coming soon in my blog and the Wiki page of the Maven plugin for resource optimization.
The deployed showcase will be available soon, as usual, at http://primefaces.org/showcase-ext/views/home.jsf.

Have fun!

Saturday, May 30, 2015

PrimeFaces Cookbook Second Edition has been published

PrimeFaces Cookbook Second Edition was published today. This is an updated second edition of the first PrimeFaces book ever published. PrimeFaces Cookbook Second Edition covers over 100 effective recipes for PrimeFaces 5.x, the leading component suite for boosting JSF applications - from AJAX basics, theming, i18n support and input components to advanced usage of datatable, menus, drag-&-drop, charts, client-side validation, the dialog framework, exception handling, responsive layout, and more.


I have updated the book's homepage with a new table of contents. There are 11 chapters and more than 380 pages. You can download the book's code, clone the project on GitHub, compile and run it. Please follow the instructions on the GitHub.

I would like to thank my family, especially my wife, Veronika; our advisers from Packt Publishing, Llewellyn Rozario and Ajinkya Paranjape, our reviewers, the PrimeFaces project lead Çağatay Çivici and JSF specification lead Ed Burns. These people accompanied us during the entire writing process and made the publication of the book possible with their support, suggestions, and reviews. Thanks a lot!

Wednesday, April 22, 2015

PrimeFaces Extensions 3.1.0 released

Today, we released PrimeFaces Extensions 3.1.0. It is built on top of and fully compatible with PrimeFaces 5.2.

Closed issues are available on GitHub. Please note the enhancements in existing components like Timeline, DynaForm, InputNumber and CKEditor.

The newly deployed showcase will be available soon, as usual, here. The next release will be a maintenance release again.

Have fun!

Sunday, April 5, 2015

A way to read properties with variable interpolation

Recently, I tried to define and read a global property in an application server. The benefit of such a property configured in the application server is that it can be shared across all web applications deployed on this server. Every deployed application can read the same property, which is configured just once in one place. What I tried to define was a system property with another system property in the value part. In the application server JBoss / WildFly, you can e.g. define a system property in the configuration file standalone.xml. I set the property exporting.service.config.file.
<system-properties>
    <property name="exporting.service.config.file" value="${jboss.server.config.dir}\exporting\exporting-service.properties"/>
</system-properties>
jboss.server.config.dir points to the base configuration directory in JBoss. This property is set automatically by JBoss. In this example, we have so-called Variable Interpolation. The definition from Wikipedia: "Variable interpolation (also variable substitution or variable expansion) is the process of evaluating a string literal containing one or more placeholders, yielding a result in which the placeholders are replaced with their corresponding values". Another example with placeholders ${...} in a property value would be the following configuration:
application.name=My App
application.version=2.0
application.title=${application.name} ${application.version}
When we now try to get the system property from the first example with Java's System.getProperty(...)
 
String globalConfigFile = System.getProperty("exporting.service.config.file");
 
we will get the value ${jboss.server.config.dir}\exporting\exporting-service.properties. The placeholder ${jboss.server.config.dir} is not resolved. The second example has the same problem.

What would be the simplest way to read properties with variable interpolation? Well, there is the Spring Framework with PlaceholderConfigurerSupport and so on. But it is overhead to have such a big framework as a dependency. Is there a lightweight library? Yes, sure - Apache Commons Configuration. Apache Commons Configuration provides special prefix names for properties to evaluate them in a certain context. There are, for instance:
  • sys: This prefix marks a variable to be a system property. Commons Configuration will search for a system property with the given name and replace the variable by its value.
  • const: The prefix indicates that a variable is to be interpreted as a constant member field of a class. The name of the variable must be the fully qualified class name followed by the field name.
  • env: The prefix references OS-specific environment properties.
 Some examples from the documentation:
user.file = ${sys:user.home}/settings.xml
action.key = ${const:java.awt.event.KeyEvent.VK_CANCEL}
java.home = ${env:JAVA_HOME}
Now, I could add the needed dependency to my Maven project
<dependency>
    <groupId>commons-configuration</groupId>
    <artifactId>commons-configuration</artifactId>
    <version>1.10</version>
</dependency>
set the prefix sys: before jboss.server.config.dir
<system-properties>
    <property name="exporting.service.config.file" value="${sys:jboss.server.config.dir}\exporting\exporting-service.properties"/>
</system-properties>
and write the following code
import org.apache.commons.configuration.SystemConfiguration;

...

SystemConfiguration systemConfiguration = new SystemConfiguration();
String globalConfigFile = systemConfiguration.getString("exporting.service.config.file");
...
The String globalConfigFile on my notebook has the value C:\Development\Servers\jboss-as-7.1.1.Final\standalone\configuration\exporting\exporting-service.properties. The prefix sys: marks a variable to be a system property. Commons Configuration will search for a system property with the given name and replace the variable by its value. The complete code:
import org.apache.commons.configuration.ConfigurationException;
import org.apache.commons.configuration.PropertiesConfiguration;
import org.apache.commons.configuration.SystemConfiguration;

...

PropertiesConfiguration propertiesConfiguration = new PropertiesConfiguration();
SystemConfiguration systemConfiguration = new SystemConfiguration();
String globalConfigFile = systemConfiguration.getString("exporting.service.config.file");
if (globalConfigFile != null) {
    try {                
        propertiesConfiguration.setDelimiterParsingDisabled(true);                
        propertiesConfiguration.load(globalConfigFile);
    } catch (ConfigurationException e) {
        LOG.log(Level.INFO, "Cannot read global properties", e);
    }            
}
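A nice side effect: Commons Configuration also resolves placeholders that reference other keys within the loaded file itself. Here is a minimal sketch, assuming the loaded properties file contains the application.* entries from the second example above (this is only an illustration, not part of the real configuration):
// assuming the loaded file contains:
// application.name=My App
// application.version=2.0
// application.title=${application.name} ${application.version}
String title = propertiesConfiguration.getString("application.title");
// title is "My App 2.0" - the placeholders are resolved on read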
Any single property can be read e.g. as
propertiesConfiguration.getString("someKey")
propertiesConfiguration.getString("someKey", someDefaultValue)
propertiesConfiguration.getBoolean("someKey")
propertiesConfiguration.getBoolean("someKey", someDefaultValue)
propertiesConfiguration.getInt("someKey")
propertiesConfiguration.getInt("someKey", someDefaultValue)
and so on. That's all. Let me know if you know other simple ways to read properties with variable interpolation.

Friday, April 3, 2015

Caching of web content with Spring's cache manager


In this post, I would like to show the basics of how to cache and manage the caching of web content with Spring's CacheManager, @Cacheable and JMX annotations. Imagine a web shop which fetches some content, such as the header, footer, teasers and main navigation, from a remote WCMS (Web Content Management System). The fetching may happen e.g. via a REST service. Some content is rarely updated, so it makes sense to cache it in the web application for performance reasons.

Getting Started

First, we need a cache provider. A good cache provider is EhCache. You need to add EhCache as a dependency to your project. You also need to configure ehcache.xml, which describes, among other things, the cache name(s) and where and how long the cached content is stored. Please refer to the documentation to learn what the ehcache.xml looks like. The central class of EhCache is net.sf.ehcache.CacheManager. With the help of this class you can programmatically add or remove any objects to / from the cache and much more. Objects can be cached in memory, on disk or somewhere else.
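Just to illustrate the programmatic usage of the native net.sf.ehcache.CacheManager mentioned above, here is a minimal sketch; the cache name "wcms-mainnavigation" is only an assumption and has to be defined in your ehcache.xml:
import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Element;

...

// creates the native EhCache manager from ehcache.xml found in the classpath
CacheManager nativeManager = CacheManager.create();
// the cache name is an assumption and must be configured in ehcache.xml
Cache cache = nativeManager.getCache("wcms-mainnavigation");

cache.put(new Element("someKey", "some value"));     // store an object
Element element = cache.get("someKey");              // read it back
Object value = (element != null) ? element.getObjectValue() : null;
cache.remove("someKey");                             // evict it again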

The Spring Framework provides a CacheManager abstraction - org.springframework.cache.CacheManager - with an implementation backed by EhCache. It also provides the @Cacheable annotation. From the documentation: "As the name implies, @Cacheable is used to demarcate methods that are cacheable - that is, methods for whom the result is stored into the cache so on subsequent invocations (with the same arguments), the value in the cache is returned without having to actually execute the method. In its simplest form, the annotation declaration requires the name of the cache associated with the annotated method". We will use the JMX annotations as well. These are Spring's annotations @ManagedResource and @ManagedOperation. Why do we need those? We need them to be able to clear cache(s) via a JMX console. Why? Well, e.g. the underlying data has been changed, but the cache has not expired yet. The outdated data will still be read from the cache and not from the native source. Beans annotated with @ManagedResource are exposed as JMX MBeans, and methods annotated with @ManagedOperation can be executed via a JMX console. I recommend using JMiniX as a simple JMX entry point. Embedding JMiniX in a webapp is done simply by declaring a servlet. Parametrized methods are supported as well, so that you can even input real values for a method's parameters and trigger the execution with these values.
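How the Spring CacheManager gets wired to EhCache is not the focus of this post. As a rough orientation, a Java-based configuration could look like the following sketch; the class and bean names are just assumptions, and an XML-based configuration works equally well:
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.cache.ehcache.EhCacheCacheManager;
import org.springframework.cache.ehcache.EhCacheManagerFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;

@Configuration
@EnableCaching
public class CacheConfig {

    // creates the native EhCache CacheManager from ehcache.xml in the classpath
    @Bean
    public EhCacheManagerFactoryBean ehCacheManagerFactory() {
        EhCacheManagerFactoryBean factory = new EhCacheManagerFactoryBean();
        factory.setConfigLocation(new ClassPathResource("ehcache.xml"));
        factory.setShared(true);
        return factory;
    }

    // wraps the native manager into Spring's CacheManager abstraction
    @Bean
    public EhCacheCacheManager cacheManager() {
        return new EhCacheCacheManager(ehCacheManagerFactory().getObject());
    }
}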

How to do it...

Now we are ready to write the first code. We need a service which communicates with a remote backend in order to fetch various contents from the WCMS. As an example, let's look at some basic code with one method, fetchMainNavigation(). This method fetches the structure of the main navigation menu and converts the structure to a DTO object NavigationContainerDTO (the model class for the menu). The whole business and technical logic resides in the bean MainNavigationHandler. This logic is not important for this blog post. The method fetchMainNavigation() expects two parameters: locale (e.g. English or German) and variant (e.g. B2C or B2B shop).
@Component
public class WCMSServiceImpl extends BaseService implements WCMSService {
 
    // injection of Spring's CacheManager is needed for @Cacheable
    @Autowired
    private CacheManager cacheManager;
 
    @Autowired
    private MainNavigationHandler mainNavigationHandler;
 
    ...
 
    @Override
    @Cacheable(value = "wcms-mainnavigation",
                        key = "T(somepackage.wcms.WCMSBaseHandler).cacheKey(#root.methodName, #root.args[0], #root.args[1])")
    public NavigationContainerDTO fetchMainNavigation(Locale lang, String variant) {
        Object[] params = new Object[0];
        if (lang != null) {
            params = ArrayUtils.add(params, lang);
        }
        if (variant != null) {
            params = ArrayUtils.add(params, variant);
        }
 
        return mainNavigationHandler.get("fetchMainNavigation", params);
    }
}
The method is annotated with Spring's annotation @Cacheable. That means the returned object NavigationContainerDTO will be cached if it is not yet available in the cache. Subsequent fetches will return the object from the cache until the cache entry expires. The caching occurs according to the settings in ehcache.xml. Spring's CacheManager finds the EhCache provider automatically in the classpath. The value attribute in @Cacheable points to the cache name. The key attribute points to the key under which the object can be accessed in the cache. Since caches are essentially key-value stores, each invocation of a cached method needs to be translated into a suitable key for the cache access. In a simple case, the key can be any static string. In the example, we need a dynamic key because the method has two parameters: locale and variant. Fortunately, Spring supports dynamic keys with SpEL (Spring Expression Language) expressions. See the table "Cache SpEL available metadata" for more details. You can invoke any static method to generate the key. Our expression T(somepackage.wcms.WCMSBaseHandler).cacheKey(#root.methodName, #root.args[0], #root.args[1]) means we call the static method cacheKey in the class WCMSBaseHandler with three parameters: the method name and the first and second arguments (locale and variant respectively). This is our key generator.
public static String cacheKey(String method, Object... params) {
    StringBuilder sb = new StringBuilder();
    sb.append(method);

    if (params != null && params.length > 0) {
        for (Object param : params) {
            if (param != null) {
                sb.append("-");
                sb.append(param.toString());
            }
        }
    }

    return sb.toString();
}
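To make the key format concrete, here is a hypothetical invocation (wcmsService is just an assumed reference to the service above) and the key it would produce with the generator:
// hypothetical invocation; Locale.GERMANY.toString() yields "de_DE"
wcmsService.fetchMainNavigation(Locale.GERMANY, "B2C");
// the SpEL expression evaluates to
// cacheKey("fetchMainNavigation", Locale.GERMANY, "B2C")
// which produces the cache key "fetchMainNavigation-de_DE-B2C"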
Let's see what the handler class MainNavigationHandler looks like. This is just a simplified example from a real project.
@Component
@ManagedResource(objectName = "bean:name=WCMS-MainNavigation",
                                description = "Manages WCMS-Cache for the Main-Navigation")
public class MainNavigationHandler extends WCMSBaseHandler<NavigationContainerDTO, Navigation> {

    @Override
    NavigationContainerDTO retrieve(Object... params) {
        // the logic for content retrieving and DTOs mapping is placed here
        ...
    }
 
    @ManagedOperation(description = "Delete WCMS-Cache")
    public void clearCache() {
        Cache cache = cacheManager.getCache("wcms-mainnavigation");
        if (cache != null) {
            cache.clear();
        }
    } 
}
The CacheManager is also available here thanks to the following injection in the WCMSBaseHandler.
@Autowired
private CacheManager cacheManager;
@ManagedResource is Spring's JMX annotation, so that the bean is exported as a JMX MBean and becomes visible in the JMX console. The method to be exported should be annotated with @ManagedOperation. This is the method clearCache(), which removes all content for the main navigation from the cache. "All content" means an object of type NavigationContainerDTO. The developed WCMS service can now be injected into a bean on the front-end side. I already blogged about how to build a multi-level menu with plain HTML and showed the code. This is exactly the main navigation from this service.

There is more...

The scanning of JMX annotations should be configured in a Spring XML configuration file.
<bean id="exporter" class="org.springframework.jmx.export.MBeanExporter">
    <property name="server" ref="mbeanServer"/>
    <property name="assembler" ref="assembler"/>
    <property name="namingStrategy" ref="namingStrategy"/>
    <property name="autodetect" value="true"/>
</bean>
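The mbeanServer, assembler and namingStrategy beans referenced above are not shown here. As an alternative, an annotation-driven Java configuration could look roughly like this sketch (the class name is an assumption):
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jmx.export.annotation.AnnotationMBeanExporter;

@Configuration
public class JmxConfig {

    // detects beans annotated with @ManagedResource and registers them as MBeans,
    // reading the JMX metadata from the @Managed* annotations
    @Bean
    public AnnotationMBeanExporter mbeanExporter() {
        AnnotationMBeanExporter exporter = new AnnotationMBeanExporter();
        exporter.setAutodetect(true);
        return exporter;
    }
}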
The JMX console of JMiniX is reachable at http(s)://<host>:<port>/mct/webshop/admin/jmx/. A click on the execute button of the clearCache() method triggers the cache clearing.

Saturday, March 7, 2015

HTTP web server for Chrome to test local web applications

If you are writing HTML and JavaScript on your PC and testing the output in your browser without setting up a server, you will probably get some error messages about cross-origin requests. Your browser will render HTML, run your JavaScript, jQuery or AngularJS web app, and ... block requests. This is common behavior in many web browsers to prevent cross-site attacks. That is understandable, because it would not be clever to allow a web page to read your hard drive from your web browser. Right?

Here is a more concrete example. Assume you develop an AngularJS web app with custom directives, and maybe you're quickly testing some AngularJS features without any web server. AngularJS loads the HTML files of custom directives on initial page load. All files are loaded by AJAX from your PC, so you will see an error message such as "XMLHttpRequest cannot load file:///C:/somefolder/somefile.html". I had e.g. these messages in the browser console:


This is because your browser doesn't allow AJAX requests for the file:/// protocol. Obviously, you need to serve all files over HTTP. The simplest solution is to install Web Server for Chrome. It serves web pages from a local folder over the network using HTTP and comes exactly to the rescue here. After installing this small server app from the Chrome Web Store, you will see a Web Server icon. I have e.g. something like


Open the web server for Chrome and choose a directory to serve static content. In our case, this is the main folder of your project.


It is able to stream large files and handle range requests, and it also sets MIME types correctly. Navigate to http://127.0.0.1:8887. You will see all the content of your project. Assume the main file is called myapp.html. After choosing this file, you will see the URL http://127.0.0.1:8887/myapp.html. The web app works now because all the separate files for AngularJS includes or directives are streamed down by AJAX over HTTP. Changes in files are visible on-the-fly without any restarts or redeployments.

Have fun!