Thursday, July 9, 2009

Curbing MESsy Shop Floor State of Affairs

Part I of this blog series expanded on some of TEC’s earlier articles about companies’ need for better links between the plant (“blue collar trenches”) and the enterprise (“white collar ivory tower”). It also pointed out the difficulties in achieving this goal. An obvious solution would be a tightly integrated enterprise resource planning (ERP) and manufacturing execution system (MES) package that would help manufacturers close the gap between the shop floor and the offices by gaining visibility into manufacturing operations, achieving shop floor control, managing product/process traceability and genealogy, and so on.

MES solutions that integrate seamlessly into existing enterprise applications thus connect manufacturing to the enterprise in order to:

* Reduce costs and improve profits by collecting and communicating real-time manufacturing data throughout the product lifecycle; and
* Closely control and continuously improve operations, quality, and visibility across facilities worldwide.

By standardizing the best practices of lean manufacturing, overall equipment effectiveness (OEE), and continuous process improvement (CPI), such solutions should provide a real-time framework that would unite capabilities like finite factory scheduling (constraints-based), operations, quality, safety, performance management (via analytics), and enterprise asset maintenance (EAM).

Plant-level execution systems have thus far largely been adopted by big companies in a big way. The historic condition in this highly fragmented market was that offerings were too niche-oriented and offered by many small software companies. A large enterprise would have to purchase many offerings and stitch them together to get a full solution. Today, however, comprehensive packaged factory solutions that are repeatable, scalable, and transferable are changing that dynamic.

Some Shining Examples

Some good examples in this regard would be a rare few ERP vendors with native MES capabilities, starting with IQMS and its EnterpriseIQ suite [evaluate this product]. In mid-2008, IQMS launched a new Automation Group to expand the interface capabilities of its EnterpriseIQ ERP system with manufacturing equipment on the shop floor.

Look for a separate article on IQMS down the track. In the meantime, you can find more information about the vendor here and in TEC’s earlier article entitled “Manufacturer’s Nirvana — Real-Time Actionable Information.” Also, there is an informative Enterprise Systems Spectator’s blog post on IQMS here.

Solarsoft (formerly CMS Software [evaluate this product]) would be another good ERP-MES example, following its acquisition of Mattec a couple of years ago. The upcoming Epicor 9 product will also feature a native MES module. Epicor 9 represents a complete rewrite and convergence, on the basis of service-oriented architecture (SOA) and Web 2.0, of selected best-of-breed functional concepts from the respective individual products (like Epicor Vantage [evaluate this product], Epicor Enterprise [evaluate this product], Epicor iScala [evaluate this product], and so on). Of course, some functionality within Epicor 9 will be brand new, while some modules will represent embedded third-party products (unbeknownst to the customer).

Again, Forget Not About Oracle

Last but not least, Oracle delivered its own integrated MES-ERP-Supply Chain Management (SCM) offering within Oracle E-Business Suite (EBS) release 12. This was in great part possible due to the vendor’s existing manufacturing capabilities/modules (e.g., flow manufacturing, work-in-process [WIP], etc.) and many customer deployments in certain manufacturing industries.

More recently, though, Oracle has been deeply involved in delivering an enterprise manufacturing intelligence (EMI) layer entitled Oracle Manufacturing Operations Center (OMOC, formerly Oracle Manufacturing Hub). Oracle hails this layer (based on the Oracle Business Intelligence Enterprise Edition [OBIEE] architecture; the category is also called operational intelligence, shop-floor integration, or real-time intelligence) as the foundation for continuous process improvement (CPI) in manufacturing operations.

While the initial generally available parts of this ambitious undertaking, such as the contextualization engine, the manufacturing operations data model (based on the ISA-95 standard), and role-based dashboards, were recently launched, look for much more to come down the track. In the meantime, Oracle points out that OMOC should not be confused with being:

* A substitute for Oracle MES for Discrete Manufacturing or Oracle MES for Process Manufacturing;
* A substitute for Oracle BI Applications;
* A data historian (e.g., the OSIsoft PI or Wonderware Historian products); or
* A pure middleware solution (it performs some integration, but is an application suite after all).

A SaaSy ERP + MES Example

Plexus Systems deserves attention for its broad and deep software-as-a-service (SaaS) offering, with a zero-client footprint (i.e., web browser only) and subscription-based pricing. Contrary to ingrained beliefs that SaaS solutions are only suited for individual departments with limited functional footprints, the Plexus Online product features an impressively deep and broad range of features for industries like automotive or medical device manufacturing.

In fact, the Plexus Online solution map reveals not only a traditional ERP scope, but also SCM modules such as program management, supplier quality, Electronic Data Interchange (EDI), lean replenishment, and corrective actions. While Plexus Online (and any other product mentioned here, for that matter) deserves a separate blog post, some of its MES capabilities worth mentioning here are: production scheduling, quality management, tooling, statistical process control (SPC), traceability, and costing (labor and material tracking).

Plexus has extensive manufacturing industry experience, since its software was developed at an actual manufacturing plant from 1989 to 1995. The company has been focused on manufacturing within industries like automotive, aerospace & defense (A&D), and industrial machinery since its founding. Plexus’ initial on-premise offering was built on Progress Software’s OpenEdge technology, but the SaaS rewrite in the early 2000s leveraged the Microsoft .NET Framework.

Part III of this blog post will analyze the critical role of plant staff, and will also report on the current approaches and offerings from CDC Software and SAP. Your views, comments, opinions, etc. about any above-mentioned solution and about the software category per se are welcome in the meantime.

We would also be interested in hearing about your experiences with these software solutions (if you are an existing user) or your general interest in evaluating them as a prospective customer.

Job Scheduling Maze in Distributed IT Landscapes

Part 2 of this blog series analyzed the ActiveBatch architecture and evolution in terms of enterprise job scheduling and workload automation functional capabilities. This was done for the three older product releases. Particularly impressive was the ActiveBatch V6 release, which introduced a few novel concepts like Job Library and Virtual Root.

ActiveBatch V7 – The Current State of Affairs

The ActiveBatch V7 release (the release has completed the beta phase and is now in the final quality assurance [QA] phase; its release date is scheduled for January 2009) is no slouch either when it comes to introducing new concepts, such as Dynamic Web Services within the Business Process Automation (BPA) module. This module allows both externally and internally developed Web services to be incorporated into workflows and to trigger workflow steps.

These specific business objects can be created on the fly to combine ActiveBatch services (e.g., a trigger service) with any Web services-enabled application, allowing jobs and job plans to retrieve information from external or internal data sources and applications. The resulting job step is reusable in multiple workflows, which facilitates the creation of composite applications.
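The composite-application idea can be sketched in a few lines. The following is a hypothetical illustration (the class names and API are invented for this post, not ActiveBatch’s actual interface): a job step wraps any callable, here standing in for a Web service invocation, so the same step object can be reused across workflows.

```python
# Hypothetical sketch of reusable job steps feeding a shared workflow
# context; the "action" callable stands in for a Web service call.

class JobStep:
    def __init__(self, name, action):
        self.name = name
        self.action = action  # any callable, e.g., a web-service invocation

    def run(self, context):
        # Store this step's result in the shared context so that
        # downstream steps can consume it.
        context[self.name] = self.action(context)
        return context

class Workflow:
    def __init__(self, steps):
        self.steps = steps

    def run(self):
        context = {}
        for step in self.steps:
            step.run(context)
        return context

# The same "fetch" step object could be dropped into several workflows.
fetch = JobStep("fetch", lambda ctx: {"rows": 3})  # stand-in for a service call
report = JobStep("report", lambda ctx: ctx["fetch"]["rows"] * 2)

result = Workflow([fetch, report]).run()
print(result["report"])  # 6
```

The point of the pattern is that the step carries its own invocation logic, so composing a new workflow is just a matter of reordering step objects.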

The integrated and extensible Job Library is also pertinent here, as it provides stored routines for applications and infrastructure to developers, users, and the business at large. The library supports the Microsoft .NET Framework class library and Web services architecture, and the V7 release enables inbound and outbound integration of jobs, notifications, and triggers. New trigger events were introduced, such as e-mail (including keyword matching), Microsoft Message Queuing, Web services, and more.

Helping with ITSM and “Green” Needs

In addition to its Web services and service-oriented architecture (SOA)-based integration and process modification capabilities, two other notable improvements debuted in V7. First, a Change Management/Information Technology Service Management (ITSM) system with a new user interface (UI) was introduced to allow the reliable and fast movement of ActiveBatch objects between the Development, Quality Assurance (QA), and Production departments. To make it easier to update and manage ActiveBatch objects, V7 supports the creation, comparison, modification, and updating of objects from one environment to another.

For example, objects can be moved from a QA environment to a production environment in a simple and auditable fashion. The environment enables synchronization across job schedulers in different departments and in different lifecycle phases, whereby object differences can be highlighted and color-coded. Users can select whether to apply changes later (save for approval), apply changes now (based on permissions), or to enable a controlled time window for changes.

All these capabilities, together with objects’ dependency resolution, aim to provide greater reliability when managing multiple and diverse computing operations. Modifications to objects can be compared across revision levels to see what changes were made and by whom. The designer can elect to roll back to a previous revision should this be required, while custom callbacks for object customization can be applied to all objects, job user accounts, queues, etc.
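A toy sketch of the comparison step (hypothetical code, not ActiveBatch’s implementation): given an object’s definition in two environments, return the fields that differ so they can be highlighted for approval before promotion.

```python
# Compare an object's definition across two environments (e.g., QA vs.
# production) and report only the fields that differ.

def diff_objects(qa, prod):
    """Return {field: (qa_value, prod_value)} for fields that differ."""
    keys = set(qa) | set(prod)
    return {k: (qa.get(k), prod.get(k))
            for k in keys if qa.get(k) != prod.get(k)}

qa_job = {"queue": "batch", "priority": 5, "schedule": "02:00"}
prod_job = {"queue": "batch", "priority": 3, "schedule": "02:00"}

changes = diff_objects(qa_job, prod_job)
print(changes)  # {'priority': (5, 3)}
```

In a real change-management flow, such a diff would feed the approval step (apply now, save for approval, or defer to a change window).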

Second, given the increasing calls for energy preservation everywhere, the V7 release introduced a “green” Power Management feature that allows idle machines to be placed in a Hibernate or Suspend state to save on power consumption and potentially take advantage of rebates or carbon credits. The dormant systems can be reactivated either at a specific time or via the hardware’s “Wake-On-LAN” (WOL) function.
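The Wake-On-LAN mechanism itself is standard: a “magic packet” of six 0xFF bytes followed by the target machine’s MAC address repeated 16 times, broadcast over UDP (ports 7 and 9 are customary). A minimal Python sketch, independent of any vendor’s implementation:

```python
import socket

def magic_packet(mac: str) -> bytes:
    """Build a Wake-On-LAN magic packet: 6 x 0xFF followed by the
    target MAC address repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def wake(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Send the packet as a UDP broadcast so the sleeping NIC hears it."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))

pkt = magic_packet("00:1A:2B:3C:4D:5E")
print(len(pkt))  # 102 bytes: 6 + 16 * 6
```

The target machine’s network card must have WOL enabled in its firmware/driver settings for the packet to have any effect.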

Moreover, the separate graphical view enables monitoring of service level agreements (SLAs) in terms of service level violations, potential violations, successes/failures, and whether the business process succeeded even with an SLA failure. There is also integration with business rules engines (e.g., for completion-rule creation and dynamic runtime prioritization).

Rule constraints can be set in terms of jobs, files, resources, and other variables. Other features include the ability for users to query jobs’ execution state and to evaluate idle times. However, the SLA capabilities have been pushed out to what will likely be a Service Pack (SP) release after the January 2009 release date of ActiveBatch V7.

From the performance standpoint, release V7 features improved scheduling through the dynamic characteristics of a historical-data-driven online analytical processing (OLAP) database. The database is constantly populated with historical data details to develop “hints” for optimized performance (e.g., to minimize elapsed time, optimize resource utilization, etc.) and to forecast server loads.
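As a rough illustration of what such historical “hints” might look like (a toy model, not ASCI’s actual algorithm), per-job average elapsed times derived from run history can be used to forecast the load of an upcoming batch window:

```python
from collections import defaultdict

# Derive per-job average elapsed times from run history, then use them
# as "hints" to forecast the load of an upcoming batch window.

history = [  # (job, elapsed_minutes) from past runs
    ("etl_load", 40), ("etl_load", 50), ("report", 10), ("report", 14),
]

def average_elapsed(history):
    totals, counts = defaultdict(float), defaultdict(int)
    for job, minutes in history:
        totals[job] += minutes
        counts[job] += 1
    return {job: totals[job] / counts[job] for job in totals}

hints = average_elapsed(history)
forecast = hints["etl_load"] + hints["report"]  # expected load if both run
print(hints["etl_load"], forecast)  # 45.0 57.0
```

A production scheduler would of course weight recent runs more heavily and account for contention, but the principle (mine history, predict load, place jobs accordingly) is the same.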

New and/or Enhanced Job Types and Views

As an additional job type alternative (to the five types explained in Part 2), the Job Library was introduced with V6 and is being expanded in V7. The Job Library contains templates of job steps for applications such as Crystal Reports, Symantec Veritas Backup Exec, multiple database functions, ActiveBatch Task Scheduler jobs, Power Management, and so on.

The Library Job Steps can be dragged and dropped to create a workflow. The user simply clicks on the provided action template, drags it to the Job Step Editor, and completes the fields, some of which use discovery to provide appropriate choices (e.g., on systems where a Microsoft SQL Server database is installed, the editor enumerates the available Structured Query Language [SQL] jobs).

Also, in ActiveBatch V7 the two existing views, System View and Design View, were combined into a single graphical view. Workflow designers can use this visual approach to create and link jobs as well as plans into “end to end” workflows using completion and alert triggers.

They can also manage them by implementing constraints or dependencies between them using drag-and-draw operations. This view also allows operations staff to graphically review not only the workflows as they are triggered, but also the progress from job to job and any status change, visually alerting them should there be an issue requiring actions such as a restart, re-run, forced completion, etc.

The Graphical System View lets users see the current status and immediate results of every job in the queue. Icons in the view show job dependencies and workflow, and jobs are color-coded based on the current run status and the last execution result (e.g., a purple color denotes a failed job).

Why ActiveBatch Wins Customers

ActiveBatch’s parent, Advanced Systems Concepts Inc. (ASCI), touts differentiating features like ActiveBatch Mobile Management (with full remote management of the system from SmartPhone and personal digital assistant [PDA] devices), incorporation of SOA for increased levels of integration, change management, etc. In addition to being inexpensive (an average deployment costs about US$35,000), ActiveBatch is also relatively easy to deploy and use because its architecture leverages many key components that are typically already deployed in the customer’s IT infrastructure.

Included in this set of supported technologies are the following: the Windows Security Model, Active Directory and Active Directory Application Mode (ADAM), Microsoft SQL Server and Oracle databases, Windows Management Instrumentation (WMI), Simple Network Management Protocol (SNMP) trapping, and so on. This familiarity-based ease of use, script language independence, and the ability to integrate with existing IT infrastructure (either via the Microsoft Component Object Model [COM] or Web services) reduce the learning curve and the overall cost of operations for customers.

ASCI admits that a customer’s decision to implement ActiveBatch over a competitor rarely comes down to a features/functions-based selection, but rather to overall issues like ease of installation and use, price, scalability, and reliability. The vendor often finds that, when users look at products like ActiveBatch, they have immediate needs, longer-term needs, and some conceptual needs (i.e., “We like that, and may find use for it in the future, so it is nice to know that it is there”).

Compared to many of the competitors mentioned in Part 1, especially those that originate from mainframe environments, ASCI’s decision to directly incorporate technologies in ActiveBatch (rather than to simply tolerate or retrofit them) gives it a strategic advantage. For example, the technologies like Windows Security Model with the Kerberos authentication protocol, ADAM (and even enhanced directory services for non-Windows platforms), cluster servers (both Microsoft’s and Symantec Veritas Cluster Server), Oracle and SQL Server databases, and so on are all integral to ActiveBatch, but rarely to other vendors’ offerings.

When it comes to scalability, as already emphasized in Part 1 and Part 2, ActiveBatch has tested up to 2,000 server connections and over 1.2 million jobs in 24 hours. Such scale is a showstopper for many if not all competing products. In addition, the Virtual Root capability for multiple occupancy of the ActiveBatch Job Scheduler (which was introduced in V6 and explained in Part 2) allows for full and secure (via directory services) transparency between users, departments, divisions, etc.

In addition, ASCI has adopted a disintermediated sales model, centralizing its operations and aggressively using the Internet to work with clients around the world from its United States (US) headquarters (HQ). This includes utilizing both ActiveBatch sales and support experts and customers’ direct access to the engineering group.

The net result is a more functional product and better customer service through improved information flow, with dramatically lower costs of sales and support whose savings can be passed on to clients. Today’s Internet allows precisely for this new approach to selling and supporting enterprise applications.

Going SaaS/On-demand or Not?

This brings us to the fact that the current ActiveBatch architecture lends itself well to the software as a service (SaaS) model. In fact, ActiveBatch Job Scheduler supports multi-tenant private/secure occupancy through the Virtual Root capability. ASCI believes that ActiveBatch’s Web-based (thin) client (in addition to mobile access to the system, and powerful backend database offerings of Oracle and SQL Server) can handle the required performance. The rapid and lightweight deployment strategy for the ActiveBatch Execution Agents also makes it easy to address firewall and other requirements.

So, while ASCI does not offer a SaaS option for ActiveBatch at this time, it understands the issues and is researching how and when this approach to the market can offer the greatest rewards for the vendor and its clients. The upcoming ActiveBatch release also focuses on SOA-based integration as opposed to business process management (BPM).

Business process orchestration that includes manual processes and human intervention, in the BPM-suite sense, is not the market ASCI is going after, at least not now. The ActiveBatch V7 release will certainly integrate with other SOA governance/BPM suites, and is compatible with many SOA/BPM applications.

Parting Conclusions

Despite its functional and architectural superiority (in terms of system availability and performance benchmarks, and versatile administration features) and its price advantage (usage-based licensing by the number of jobs per month, while Web hosting reduces the vendor’s cost of operations), word of mouth still largely sells ActiveBatch. The company needs to invest in much more aggressive marketing and prospective-customer education, despite the large prospective audience and a wide-open playing field.

Based on data collected over the last several months by ASCI, which should provide a reasonable indicator of ActiveBatch usage, the two major deployment areas have been data warehousing and business intelligence (BI), followed by enterprise application integration (EAI) of new, legacy, and/or packaged applications.

One possible fertile ground for ActiveBatch to pursue would be to educate prospective users to better understand and consider the addition of powerful scheduling and automation capabilities to their existing enterprise applications sets like supply chain management (SCM), customer relationship management (CRM), BI, etc. These commercially available applications might require products like ActiveBatch to allow them to be integrated into and across workflows as well as to add scheduling techniques to the more rudimentary scheduling found within these applications themselves.

A good case study would be a semiconductor manufacturer with 15 disparate enterprise resource planning (ERP) systems in its plants running on Microsoft Windows and Sun Solaris platforms. An overarching SCM product from i2 Technologies is used to create an overall multi-site master production schedule (MPS).

Generating a corporate-wide MPS involves processing more than 100 jobs, all of which must be performed in a precise sequence. Given so many dependencies, the failure of one job would affect the true optimization of the final MPS plan; it is thus better to repeat the failed job (after investigating and fixing the cause) than to accept a suboptimal overall plan. ActiveBatch came in handy in that regard: the manufacturer credits it with a 50 percent reduction in planning staff, a 30 percent reduction in inventory, and a 10 percent reduction in cycle time.
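The underlying scheduling pattern (run dependent jobs in a fixed order, and retry a failed step rather than let a suboptimal result propagate downstream) can be sketched as follows; this is a simplified illustration, not the manufacturer’s actual configuration:

```python
# Run jobs in dependency order; retry a failed step before continuing,
# and abort the plan if a step still fails after its retries.

def run_plan(jobs, max_retries=2):
    """jobs: list of (name, callable) in dependency order.
    Each callable returns True on success, False on failure."""
    completed = []
    for name, job in jobs:
        for attempt in range(1 + max_retries):
            if job():
                completed.append(name)
                break
        else:
            raise RuntimeError(f"job {name!r} failed after retries")
    return completed

attempts = {"count": 0}
def flaky_extract():
    attempts["count"] += 1
    return attempts["count"] >= 2  # fails once, then succeeds

plan = [("extract", flaky_extract), ("plan_mps", lambda: True)]
result = run_plan(plan)
print(result)  # ['extract', 'plan_mps']
```

With 100-plus interdependent jobs, encoding this retry-before-proceed logic in a scheduler (rather than in operator runbooks) is exactly where the staffing and cycle-time savings come from.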

To that end, ActiveBatch might benefit from having ready-made connectors to leading ERP products. Its competitor Redwood Software has long had a custom solution using a “connector” aimed specifically at SAP Central Process Scheduling and SAP Solution Manager. ActiveBatch might provide a more powerful framework, but it does not offer an SAP connector at this time. The vendor would currently use its SOA Dynamic Web services or a command line interface (CLI) as a general interface to SAP.

ASCI plans to release an SAP connector for ActiveBatch in 2009. Meanwhile, the ActiveBatch service library allows the vendor to connect applications using Web services, expose their methods, and make use of their properties. The created service is then associated with a job as part of the Job Library. The goal is to allow any application or system that can use Web services to integrate with and trigger any other application.

Therefore, dear readers, what are your views, comments, opinions, etc. about ActiveBatch’s applicability, and about the enterprise job scheduling and workload automation software markets in general? We would also be interested in your experiences with this software category (if you are an existing user), and your general interest in evaluating these solutions as a prospective customer.

ECM vs. EIM

The IT industry is built on three-letter acronyms (TLAs). However, the total number of possible three-letter abbreviations using the 26 letters of the alphabet is only 17,576. This explains the stars-wearing-the-same-dress type of incident in the IT world. When Sherry Fox discussed ECM and EIM, the acronyms represented enterprise compensation management and enterprise incentive management, respectively. In this blog, the two “dresses” are worn by two different stars: enterprise content management and enterprise information management.

In Sherry’s blog post, the two TLAs are logically parallel. The concepts of enterprise compensation management and enterprise incentive management both come from the area of human resources (HR), with perspectives that overlap as well. In this blog, however, ECM and EIM are on two different levels. To be more specific, ECM is an important building block of the broader EIM perspective.

So, What Is Enterprise Content Management?

The Association for Information and Image Management (AIIM) defines ECM as

the strategies, methods, and tools used to capture, manage, store, preserve, and deliver content and documents related to organizational processes. ECM tools and strategies allow the management of an organization’s unstructured information, wherever that information exists…

According to this definition, content and documents are the objects being managed. This factor differentiates ECM systems from most of the other management systems. Not quite clear what I’m talking about? Taking a look at the two categories of data in the enterprise environment will help.

Not very precisely (but practically), data is divided into structured data and unstructured data (for more information, see Wikipedia’s entries on structured data and unstructured data). Different types of management systems have different coverage of the two categories.

For example, enterprise resource planning (ERP) systems generally focus on structured transactional data. On the other hand, ECM systems have more focus on managing unstructured data (e.g., natural language text, images, videos, documents). In addition, there are also systems covering both—for example, product lifecycle management (PLM) systems. You can find out more about ECM by visiting TEC’s Content Management System (CMS) Evaluation Center.

And, What Is Enterprise Information Management?

Compared with ECM, EIM is more difficult to define, due to a lack of consensus in the industry. Wikipedia defines EIM as “a field of interest… in finding solutions for optimal use of information within organizations.” So, understanding “information within organizations” is a prerequisite for understanding EIM.

In today’s enterprise environment, information is hidden behind both structured and unstructured data. Let’s take a look at a simplified example. In order to make a decision on an advertising budget for the next season, a marketing manager will probably need the following information:

* a sales analysis based on sales history data (structured data)
* a production plan based on manufacturing planning data (structured data)
* demographic information in the database or in a document (structured or unstructured data)
* a report from a marketing consulting firm in PDF format (unstructured data)
* sales and marketing meeting notes in Microsoft Word format (unstructured data)
* and so on…

This list could go on and on, but the main idea here is to show that decision makers (as well as people in daily operations) need information from different sources.

On the structured data side, thanks to business intelligence (BI), today’s business users have a powerful and efficient way (relatively speaking) to navigate through the vast data generated by the many management systems in use and receive good quality information.

For unstructured data, ECM is the usual tool of choice.

Simply speaking, EIM is another layer on top of BI and ECM, or an integrated combination of BI and ECM that facilitates the generation and use of valuable information from various data sources.

Besides ECM and BI, other technologies (such as master data management [MDM], metadata management, and enterprise portals), IT infrastructure strategies, and IT governance are also parts of the EIM equation.

Is EIM Just Another Marketing Buzzword?

Anyone who answers “yes” has probably had the experience of seeing a vendor use EIM as a marketing tool to repackage ECM or BI offerings. This reminds me of a similar situation where a product data management (PDM) vendor switched its label to PLM overnight without changing the actual offering.

Although there are different opinions about what EIM should be, it is certain that ECM (or BI) alone is not sufficient to meet true EIM requirements. Similarly, simply packaging the two together will not make a good EIM solution. Here’s my take on EIM:

* EIM is a requirement due to the way the enterprise information environment is constructed—different technologies in use, multiple systems covering fragmented business processes, communication barriers creating information silos, and so on.

* EIM is a strategy that affirms that data connectivity and transparency will enable users to obtain more valuable information in order to improve business insight.

* EIM is a process that combs through massive data sources and ties together all the relevant pieces where and when they are needed.

For further reading, I recommend Enterprise Information Management: Information Virtualization for a Unified Business View. Created by EMC, this is one of the best EIM documents I have found on the Internet.

Now for a little fun. In the IT world, are there any interpretations of ECM or EIM other than the ones Sherry and I have discussed? Please let us know in the comment section of this blog…

May a New Day Begin for Mature Enterprise Applications

Part 1 of this blog series outlined the trend of enterprise applications vendors’ attempts to win their users’ hearts and minds (as well as wallets) via more intuitive and appealing user interface (UI) and user experience (UX) design. What that means is that users can now more quickly obtain all of the relevant information they need in a personalized way, with drill-downs and other slick navigational Web 2.0 gadgets.

For users, personalized screens and forms provide immediate access to issues that require immediate action or reassurance that situations are under control. Such intuitive UI allows users to diagnose the most critical business situations they face and immediately drill into the source transactional systems to get the data they need and decide on appropriate actions.

The analysis then focused on Infor and its Open SOA framework, which is the enabling linchpin for the vendor’s delivery of next-generation interoperable value-adding solutions. About two years ago, Infor espoused its so-called “Three E’s” strategy (“Enrich, Extend & Evolve”) to deliver agile and adaptive software components on top of the Infor Open SOA platform.

Infor’s “Three E’s” Approach

The “enrich” part of the strategy refers to adding value to Infor’s raft of current products (solutions or assets). Infor has released over 100 product upgrades and feature (service) packs free of charge for customers on active maintenance contracts. It is also important to note that there is no forced march imposed upon customers here; these feature packs can be enabled or disabled by turning the appropriate switches “on” or “off” in a parameterized setup.

The “extend” part of the strategy refers to extending functional footprint via OSGi standards–based interoperability within Infor’s portfolio of applications in order to meet the growing complexity of global supply chains. Customers will receive ongoing service-oriented architecture (SOA) integrations. On one hand, these product connections represent cross-selling opportunities for Infor; on the other hand, they should also enable customers to extend their current solutions and build a broader foundation for future capabilities that might be required.

For example, Infor’s enterprise resource planning (ERP) users will be able to leverage Infor’s supply chain management (SCM), business performance management (BPM), or enterprise asset management (EAM) products. But in contrast to the “enrich” feature packs (and new individual product releases), these new functional capabilities are understandably available for an additional license fee.

Finally, the “evolve” part of Infor’s Open SOA strategy follows along the lines of developing brand new products that will solve some particular business problem and improve users’ competitiveness (and thus will not become obsolete for quite some time). These new components promise to feature universal interconnectivity to major Infor products.

Depending on their nature, they will either be free of charge (e.g., Infor MyDay) for eligible customers or for a commensurate license fee. For more details, see TEC’s previous article entitled “Ambitious Plans and Promises: An Enterprise Software Provider Keeps Its Word.”

At the Inforum 2008 user conference, Infor touted about 20 new “evolve” components to be generally available (GA, i.e., tested in limited release with real customers for several months) by the end of 2009, with many more to come afterwards. In addition to the MyDay role-based portal (which will be described soon), Infor Decisions and Infor Order Management are already GA.

Infor Decisions brings real-time, enquiry-driven business intelligence (BI) to line of business (LoB) managers. These folks have been inundated with data coming from disparate sources such as financial management systems (e.g., actual vs. budget), customer service systems (e.g., customer and product profitability), external systems (e.g., a customer’s financial performance) and operations (e.g., inventory status).

But this overflowing data has unfortunately not traditionally been linked to its business context. To that end, Infor Decisions drives a “train of thought” inquiry, which transforms users from transactional workers into information workers and facilitates informed decision-making and action.

For its part, Infor Order Management provides multi-model pricing and time-phased inventory reservation across supply chains. The solution was built to enable a true, unified order experience, i.e., to reflect how companies really sell and how customers actually buy (and not how the system “thinks” the trade happens).

For example, order capturing and inventory reservation can take place centrally, while the actual delivery and customer service take place in a particular local division. Infor Order Management was designed with flexibility in mind to accommodate ever-changing business practices.

Evolution Continues

The upcoming Infor Advanced General Ledger (formerly also known as Multi-Books Accounting) module is an “evolve” component that should give global companies the ability to conform to multiple, country-specific accounting standards and currencies. The module can either run concurrently with an existing general ledger (G/L) system or serve as the primary accounting module.

The idea behind the multi-books accounting capability is to enable the system to work alongside existing financial management systems to help companies cast their financials in multiple ways. If, for instance, a corporation has operations in China, India, or Brazil, it must follow each government’s rules on precisely what local accounting concepts and regulations, such as “salary,” “wage,” “value-added tax” (VAT), or “sales tax,” mean. How do users get a system that works in China, Latin America, the US, and Europe without having to rip out and replace what they already have? Advanced G/L and a few dozen other upcoming “evolve” components, such as pricing, contracts & promotions; actual costing; multi-echelon inventory control; and sales & operations planning (S&OP), are slated for delivery by the early 2010s.

So, What’s The Big Deal with Infor MyDay?

Typically, when I attend vendors’ annual events, I ask their staffers to tell me what, in their minds, is the highlight of the conference. I was thus a bit dismayed to hear that the strangely named MyDay feature was the major theme of the Inforum 2008 event.

Namely, IFS Enterprise Explorer (IEE, part of the ongoing Project Aurora), Microsoft Dynamics Client for Office (DCO), Lawson Smart Office, Epicor Productivity Pyramid, IQMS Smart Page UI, and the like all revolve around similar themes: role-based portals, contextual analytics, key performance indicators (KPIs), alerts, dashboards, shortcuts, favorite/recently used pages, etc. In addition, Microsoft’s role-based UI, implemented with common controls and gadgets, has been delivered for basically all of the Microsoft Dynamics enterprise resource planning (ERP) products, after first being introduced and tested in Microsoft Dynamics GP.

Thus, I wondered what I was missing at the time within MyDay that was making me a bit indifferent (and why I should not have been indifferent). To be fair, Infor MyDay is designed to deliver persona-based content to over 150 roles Infor has identified in its customers’ businesses. Infor’s blog post explains as follows:

“…What do we mean by persona-based? A persona is a composite of a user within an organization. A lot of vendors talk about role-based interfaces. A persona takes this concept to the next level. A role is generic, designed for a departmental role such as the “finance user.” A persona is specific to an individual user within that department, such as the VP of Finance or Controller. A persona also adds texture to that individual. At Infor, we’ve given them names and faces and built stories around their life. These are imaginary people, but they are based on the hundreds of users we studied to understand the real needs that real people need to get their jobs done. From conception, design and development to sales education and marketing, this gives us the understanding we need to build and deliver great content for people.

Let’s take a look at one of the 16 persona roles we are delivering with this first release, Bob the Production Planner. Bob is a composite of the typical production planner. He is the choreographer of the manufacturing shop floor, managing planning and production. He determines what to produce, how much and when it’s needed. He acts as the go-between between the shop floor and the corporate side of the firm. He has a degree, probably business or engineering, and about 10 years experience with manufacturing. He knows how to use applications but he’s not an IT gearhead.

Bob has to deal with unexpected events – late purchase deliveries, machine downtime or last minute work orders. He wants to be more proactive, but the reality is that he is in ‘reacting mode’ much of the time and plans are always changing. He has to deal with inaccurate inventories and bill of materials, and he has an avalanche of unstructured information that he needs to gather, format and assimilate to take action on.

From our research, we have learned Bob’s typical responsibilities, his skills, his working environment, pain points and goals. We have learned how he uses his ERP software, the other applications he uses and the value he needs to get from them. We learned all of this because we’ve done our homework. A lot of it. We started in early 2007, logging thousands of hours of research into the personas of the people using our software. We’ve built-in the content they need to make their lives a little easier, so they can focus on strategic activities instead of looking for information…”

While an impressive and thorough exercise, persona-based profiling (and subsequent UI tailoring) is not necessarily a unique practice. Namely, TEC’s recent article entitled “Application Giants in Duel—and Duet—for Users’ Hearts, Minds … and Wallets” explains at great length Microsoft’s rationale for its elaborate approach to UX, including role centers (based on numerous interviews of real-life users and their needs).

It’s About Making the Users’ Day (and Less about Impressing Analysts)

The just-announced availability of Infor MyDay for Infor ERP Adage [evaluate this product], the renowned process manufacturing ERP solution, has given me an answer to my quandary. Following its unveiling at Inforum 2008, this represents MyDay’s first GA release for one of Infor’s many ERP solutions.

As the recent Infor blog post explains, one issue that almost all process manufacturing companies can relate to is the cost to service customers. In most process ERP systems, actual cost of production, post-invoice rebates, disallowed discounts, non-salable allowances, and return data are just some of the data captured over time.

“…This data often resides in ERP modules or disparate, standalone systems. Most companies struggle to pull this information together with customized reports, spreadsheets or complex general ledger allocations. The problem is, by the time you sort through all the noise, the information is months old and the impact is diluted.

With Infor MyDay, the information is immediately at your fingertips, without having to call IT to develop a custom report. This information is built into the Infor MyDay personas for finance and sales managers, who receive these reports on their personalized page and can now better control cost to service and ensure their most profitable customers get the most profitable products…”

All this time, I had been thinking as an industry analyst (rather than as a user of a specific product), comparing MyDay to what other vendors were doing and pondering possible market differentiation. To a user of an aged Infor ERP Adage instance, however, a very functional product but with a rudimentary UI (to put it mildly), MyDay will likely feel like time travel at least two decades into the future (or like being a participant on ABC’s Extreme Home Makeover show).

And who might then care about what other vendors might be doing in this regard? The Infor ERP Adage MyDay Datasheet is available at the company’s web site here.

Now I get that MyDay, being free of charge and exhibiting a unifying, dynamic, and snazzy UI, should resonate with Infor customers on antiquated and diverse products. The UI enhancement has also very recently been made GA for Infor ERP LN, Infor ERP SyteLine, and Infor ERP Visual, giving these users, too, visibility into the “why” and “when,” and not just the “what,” of business operations. As mentioned in Part 1, Infor MyDay does this by filtering data by job function and relevancy and delivering it in a condensed home-page format.

Dear readers, what are your views, comments, opinions, etc. about the concepts of improved UI/UX in general, and about Infor MyDay per se? Are these capabilities worth staying on a maintenance contract for (or being reinstated on one)? What do you think about how Infor will fare against its formidable competitors in light of its lofty strategy and recent concrete moves?