Able-One Blog

Eden Watt

Executive leader, project director (PMP) and consultant with 25 years’ experience working with high-performing teams to deliver business-transforming software and multi-channel digital solutions. Creative, out-of-the-box thinker with awards for photography, video production and writing.

Recent Posts

Modernization and The Third Platform

By Eden Watt, Vice President, Application Innovation, Able-One Systems

Gartner, IDC and other industry analysts produce compelling research that drives organizations toward technology innovations that transform business. A key concept in this research is what they call the Third Platform, also known as SMAC (Social, Mobile, Analytics, Cloud), which is defined as consisting of the inter-dependencies between mobile computing, social media, cloud computing, big data/analytics, and now the Internet of Things (IoT).

According to this terminology, the first platform refers to mainframes, which began in the 1950s and continue today, for example with IBM i, z and p servers. The second platform represents the client/server architecture, which became popular in the 1990s and also continues today, with desktop applications interfacing with back-end servers (including ‘mainframes’).

Figure 1 below is a graphical representation of the Third Platform, from IDC. For a great summary of this material, including videos, check out http://www.idc.com/prodserv/3rd-platform.

Figure 1: The Third Platform (source: IDC)

It’s important to note that if your platform is considered ‘first’, this doesn’t necessarily mean it needs to go or be replaced. However, it’s important to consider how one incorporates newer capabilities and technologies with existing applications and infrastructure. This leads us to the ever-popular topic of modernization.

To modernize means to enhance something to bring it up to date. Some organizations may not consider their projects to be ‘modernization’ projects, but if you’ve extended your enterprise systems to enable mobile or web interfaces, perhaps enabling customers to order online, perform web inquiries or fill out e-forms… or used web services to automate business processes or interfaces between systems… or upgraded your DB2/400 database to SQL… or mined your data sources to perform business intelligence or advanced analytics… or provided more advanced tools for your developers… or re-engineered business processes to reduce manual steps or increase automation… well, you get the idea. If you have enhanced your business applications and application management in any of these areas, then this is typically considered a modernization project.

Some object to the term because it implies that what you’re starting with is old or out of date. We typically refer to ‘first platform’ systems as legacy systems, or, for those who do not want to offend, heritage systems. The bottom line is that if the core of the application, including the user interface design, database and application code, was developed in the 1990s or earlier, then it fits in this category. That’s not a bad thing, and it doesn’t mean the application hasn’t served the business well, both in the past and today. However, it’s a safe bet that the original architecture was not designed to take advantage of current “third platform” technologies, which is important if you want to remain competitive and support your business goals.

The specific business drivers for a modernization project vary, ranging from the need to deliver applications to customers and partners, integrate applications after acquisitions or line-of-business changes, reduce manual, paper-based or redundant processes, deliver operational efficiencies and workflow improvements, streamline and automate operations, assist executives in analyzing business trends, enable roaming employees to do their jobs from mobile devices, and so on.

The technology evolution we’ve seen in the 2000s is leading to “nothing less than the reinvention and continuous transformation of every industry in the world,” according to IDC’s Chief Research Officer, Crawford del Prete.

So, whether you are reacting to a specific, upcoming business challenge and related modernization project or are working with your organization to plot a strategic direction for the future, IT leadership is critical to business success in this Third Platform Era.

On December 2, 2015, we’re hosting an event for Application Managers to collaborate on the latest strategies and technologies for modernization on IBM i.

Modernization has been cited in numerous studies as a top concern among organizations, particularly those with mission-critical applications on IBM i servers. IBM i has a solid reputation in areas such as low cost of ownership, power, scalability, reliability and security, but because of its initial, widespread growth in the 1990s via software applications developed in that timeframe, many organizations today are working with systems that may need to be updated or extended to take advantage of current technologies.

The key goal of any modernization project must be to leverage your investment in key enterprise systems by extending them to align with current business goals and technology innovations.

IBM Power Systems running the IBM i operating system, previously known as AS/400 or iSeries, can continue to be a strategic platform for operating your business for years to come, but we believe ongoing application modernization, extensions and/or retooling are key to ensuring that the rest of your organization believes this too.

However, modernization is used to describe a wide range of tactics and strategies, and not every approach makes sense for every organization. We will continue to share customer stories and the latest technologies and strategies on this blog, at our events, and in our dealings with our customers. If this is of interest to you, we’d love to see you at our event, which we plan to make both informative and collaborative so you can determine what makes sense for your business.

To explore options for modernizing your applications, join us in-person for a special event on Wednesday December 2, 2015 from 12 PM to 3 PM in Toronto. Register today.


Topics: Modernization

To SQL Or Not To SQL? That Is The Question That Faces Today’s RPGLE Programmer

By Chuck Luttor, IBM i Software Developer at Able-One Systems Inc.

SQL has been a standard for relational database management and access across platforms since the 80s, and it has been offered for IBM i for almost 20 years; however, it is still not in use by many IBM i shops. This is because DB2/400 was originally released with DDS, and it wasn’t until the early 2000s that SQL started to perform better on IBM i.

In the past 15 years, IBM has invested heavily in SQL on IBM i and is incorporating all new database advances into SQL.

On top of that, IBM provided the Generate Data Definition Language (QSQGNDDL) API years ago to generate SQL data definition language statements from DDS, and there are many complementary tools available to make this easy.
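As an illustration (the library, file and column names here are hypothetical, not taken from any real system), the kind of DDL that QSQGNDDL or a complementary tool might produce from a simple DDS-described vendor master could look like this:

```sql
-- Hypothetical vendor master, as SQL DDL generated from its DDS definition
CREATE TABLE MYLIB.VENDFILE (
  VENDNO   DECIMAL(7, 0) NOT NULL,   -- vendor number (the DDS key field)
  VENDNAME CHAR(30)      NOT NULL,   -- vendor name
  VENDTYPE CHAR(1)       NOT NULL,   -- vendor type code
  PRIMARY KEY (VENDNO)
);
```

From here the table can take advantage of SQL-only features (identity columns, constraints, longer names) that DDS never received.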

So, why are so many shops still using DDS? After all, how can we, as IBM i developers, hold our heads high and claim that we are doing the best job possible if we are missing out on the last 15 years of database advances which IBM has incorporated into SQL for i? We can be much more productive by using all the great additional capabilities that today’s SQL provides. As an aside, I recently heard Frank Soltis state that almost all database enhancements for the i were made to SQL and that, with only a few exceptions, DDS has not been enhanced since 2000. I will take his word for it.

As an RPG programmer, I believe that it is not only desirable but necessary to replace both DDS and RPGLE op codes with SQL Data Definition Language (DDL) and SQL input/output statements. However, these are separate steps within an SQL project. Which is best done first? DDL redefinition of the existing database is, in and of itself, useful only for some hardware performance gains. SQL I/O in RPGLE programs, on the other hand, helps programmers be more productive, so that step should come first: programmers are far more valuable and costly than hardware. And that step can be accomplished by the programmers themselves with nothing more than management’s agreement and a measure of initiative. How many IT projects are that low cost and high return?

To achieve the next step, DDL database redefinition, IBM has kindly arranged the i’s database so that it can be done without recompiling the existing RPGLE programs. Database management tools are available to automate the tedious job of creating SQL to update and reformat database objects, providing “select the options” GUIs along with documentation/cross-reference and promotion tools. Database administrators are still not required. However, that topic is for another blog.

My Plan

It will be my task in this blog to demonstrate beyond any reasonable doubt that the average RPGLE developer can quickly become proficient in replacing RPGLE DB operation codes with their SQL equivalents when new programming is undertaken.

In each installment of this blog I will visit an op code or set of op codes in order to prove my contention. I used the IBM i that was handy to me to generate and test my examples. It is at V7R1 with the latest Technology Refresh level, and my example code is fully free-format. I believe that every RPGLE programmer will understand the examples even if they do not yet have access to V7R1 and/or full free-format.

CHAIN vs SELECT INTO

First up today is CHAIN. I remember this op code from System/3 Model 6 and Model 10 disk days. Yes, I have been around for a long, long time. It has been used extensively by every RPGLE programmer since then. It is the basic op code for random access. In the “old days” it was used extensively to access disk records by relative record number as well as by key. Probably no longer.

Let us discuss its keyed disk access merits vs the merits of its SQL equivalent, SELECT INTO.
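As a minimal sketch of the comparison (assuming a hypothetical VENDFILE keyed on VENDNO, read into an externally described data structure vendds), the two forms in free-format RPGLE are:

```rpgle
// Traditional keyed access: CHAIN with the no-lock extender
chain(n) vendno VENDFILE;
if %found(VENDFILE);
  // record fields are now populated
endif;

// The SQL equivalent: SELECT INTO
exec sql
  select *
    into :vendds
    from VENDFILE
    where vendno = :vendno
    with NC
    fetch first row only;
if sqlcode = 0;
  // vendds fields are now populated
endif;
```

The clauses of the SELECT are examined one by one below.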

Part 1 - File Definition (see part1 in col. 1 at line 000105 of example 1)

Explicit definition of files is required by RPGLE and not required by SQL. In fact, each SQL SELECT INTO can specify its own lock and isolation parameters, which we will review briefly in the SELECT’s clauses. If CHAIN(N) and UNLOCK are also used, then this is a wash as far as locking is concerned.

Part 2 - Data Definition (see part2 in col. 1 at line 000109 of example 1)

I, for one, always define my normalized records with an external data structure; that way, in debug, I can see the whole record with one eval command. While this is not necessary in RPGLE, it is necessary in SQL if we are to conveniently access the row’s data. Otherwise, we must individually specify each column. Now, if we have a 1,000-column row, which is quite possible in a VIEW or LF with extensive joins, then it may well be to our program’s performance advantage to specify only the columns we want, if they are few enough. Beyond a dozen columns or so, I say use a data structure; my time is valuable. And such a VIEW or LF could save many CHAINs and SELECTs.

Although the logical purists among us will argue that this is a requirement of SQL and not of RPGLE, I say that there is no meaningful program which does not require some debugging of input/output, and therefore this DS is also a requirement in RPGLE. I say part 2 is a push (a betting term for a draw) between the opponents.
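In free-format RPGLE such an external data structure is a one-liner; assuming our hypothetical VENDFILE, it might be declared as:

```rpgle
// Externally described data structure: one variable holds the whole row,
// so one eval command in debug shows the entire record
dcl-ds vendds extname('VENDFILE') end-ds;
```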

Part 3 – Record or Row Access (see part3 in col. 1 at line 000121 of example 1)

We have arrived at the heart of the matter. Compare, if you will, the CHAIN statement at line 122 with the SELECT INTO statement starting at line 127. Clearly the SELECT is more complex, and for the unfamiliar it will require some learning. Now also consider the effort that would be required with CHAIN to accomplish the same things that this SELECT statement is capable of. We will consider each clause as shown below.

For each keyword and its parameter(s), here are my observations.

select *

I think we can all learn immediately that “select *” reads all the columns of a TABLE or VIEW and that we can specify individual fields separated by commas, just as easily. For example: “select vendno, vendname”. This is very similar to RPGLE where by default we read all fields and must specify the input record fields if we wish to read a subset.

into :vendds

Host program variables are distinguished from SQL variables/column names by putting a colon in front of them. This is not so different from an op code with a data structure as its result. But consider: if you are accessing a master record in order to place some field(s) into a transaction record, if and only if the master file row with the specified key exists, then you could do this: “select vendname into :transvname where vendno = :transvno”.

from VENDFILE

Usually we would use the IBM i object location rules, so this would be taken as *LIBL/VENDFILE. We could also specify LIBRARY/FILE. We could opt for SQL naming rules, but that is unlikely.

where vendno = :vendno

This is the KLIST or, as in our example CHAIN, the key parameter. A bit of extra keying, but surely no effort to learn. Again, this where clause can be very powerful: many IF statements following the CHAIN can often be replaced.

For example, suppose there were many types of vendors and we needed to add transaction fields only for type ‘A’. Consider “select vendname into :transvname where vendno = :transvno and vendtype = ‘A’”. Keeping it simple, the where clause is really the equivalent of an IF TRUE statement.
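Sticking with the hypothetical vendor file, the CHAIN-plus-IF pattern and its single-statement SQL replacement might be sketched as:

```rpgle
// CHAIN, then test the vendor type in RPGLE...
chain(n) transvno VENDFILE;
if %found(VENDFILE) and vendtype = 'A';
  transvname = vendname;
endif;

// ...or let the compound where clause do both tests at once
exec sql
  select vendname
    into :transvname
    from VENDFILE
    where vendno = :transvno and vendtype = 'A'
    fetch first row only;
```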

with NC

Simply, this means no commitment control and no lock. It is the default, but it may not be what you want when concurrent access is an issue (you may not want to read uncommitted rows). Each SQL statement can have its own locking and isolation parameters. If you want exclusive locking, you would code “with RR use and keep exclusive locks”; then you would need to issue an UPDATE, DELETE or INSERT followed by a COMMIT or a ROLLBACK. Isn’t it time to use commitment control anyway?

Finally, omitting this clause will default the SQL statement to the commitment control and isolation level set previously. We will deal with isolation, row locking and commitment control in another installment as it is too extensive a topic to adequately discuss here.
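A sketch of the exclusive-locking variant just described, again against the hypothetical VENDFILE (the WITH clause is standard DB2 for i isolation syntax):

```rpgle
// Read the row under repeatable read, holding an exclusive lock
exec sql
  select *
    into :vendds
    from VENDFILE
    where vendno = :vendno
    with RR use and keep exclusive locks
    fetch first row only;

// ...change the fields in vendds as required...

exec sql
  update VENDFILE
    set vendname = :vendds.vendname
    where vendno = :vendno;

exec sql commit;   // or: exec sql rollback;
```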

fetch first row only

Always include this clause for compatibility with CHAIN’s behaviour for duplicates. You can omit it if you want to check that the SELECT returns one and only one row; the multiple-rows-returned error, SQLSTATE = ‘21000’ or SQLCODE = -811, leaves the host variables assigned unpredictably.

Part 4 – Exception Handling (see part4 in col. 1 at line 000134 of example 1)

For the straightforward found or not-found condition, is there really anything to choose between? SQLCODE will be something other than zero for any condition other than “row found and all data transferred successfully”. SQL does, however, give the programmer access to a wide range of warnings as well.
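A minimal found/not-found test after a SELECT INTO might read as follows (SQLCODE 100 is the standard “no row found” code; negative values are errors):

```rpgle
exec sql
  select *
    into :vendds
    from VENDFILE
    where vendno = :vendno
    fetch first row only;

select;
  when sqlcode = 0;
    // row found and all data transferred
  when sqlcode = 100;
    // no row matched: the equivalent of a failed CHAIN (%found = *off)
  other;
    // something else went wrong: inspect sqlcode / sqlstate
endsl;
```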

Exception handling will be another day’s topic. There will be no substitute for familiarizing oneself with the SQL Reference Manual and the SQL Messages and Codes Manual in the same way as the RPGLE Reference Manual. They are all available on-line.

Replacing CHAIN with SELECT INTO

In conclusion, the learning required to replace the typical CHAIN is remarkably light (except for isolation, and that is not easy in any context), and I venture to suggest that within their first 20 SELECTs, a considerable majority of programmers will have it in hand.

I have kept it simple by considering only the CHAIN equivalence. If we had considered all the other capabilities of a SELECT INTO statement, this article would be orders of magnitude longer and so confusing that no one would want to convert.

Stay tuned for our next blog post, when we discuss SETLL and READE vs DECLARE CURSOR and FETCH.

If your company needs assistance with deciding on an SQL vs no SQL approach, contact us today for a free consultation.

Topics: Modernization

Tracking Gartner's Top Ten Strategic Technology Trends

By Eden Watt, Vice President, Application Innovation, Able-One Systems

In today’s business world, the importance of information technology is a given, not only from the operational perspective of maintaining mission-critical systems, but also in strategically steering the organization to face competition and meet the needs of a new generation of customers with unprecedented expectations of how you must engage with them.

To stay on top of these ever-changing demands, an understanding of where technology is going should be in the wheelhouse of every C-suite executive (not just the CIO).

To help advise customers in this area, I’ve always found tracking the Top Ten Strategic Technology Trends that Gartner releases annually to be a good starting point. Anyone who has been involved in steering technology decisions over the last five years will know that mobile, analytics, social, and cloud computing along with security concerns, have been key drivers for buying decisions.

However, as technology matures and morphs, the complexity of these requirements (and offerings) takes on new meaning. Consider the chart below, which summarizes Gartner’s Top Ten Technology Trends for the past seven years. I’ve highlighted Mobile topics in green, Analytics in purple and Cloud in blue to give you a flavour of how these have all evolved and, in many cases, become intertwined.

Chart: Gartner’s Top Ten Strategic Technology Trends, 2009–2015

For 2015, Gartner has grouped their Technology Trends into three areas, as follows:

Merging the Real World and The Virtual World

  1. Computing Everywhere – This is an extension of the mobile evolution, but it encompasses computing everywhere around us: not just mobile devices but screens everywhere, wearable devices, car systems, etc., and many form factors, screen sizes, interaction styles (touch, voice, keyboard, mouse, gesture), platforms and architectures.
  2. The Internet of Things – This encompasses the growing world of gadgetry, displays, smart sensors, embedded intelligence and tracking, with data streams and services created by digitizing everything, and four identified usage models that must be considered: Manage, Monetize, Operate and Extend. Over 50% of Internet connections are “things”.
  3. 3D Printing – This is growing rapidly in key areas:
  • BioPrinting – still emerging, but can be used today for products such as hearing aids and prosthetic limbs
  • Consumer – 95% increase predicted for 2016, becoming pervasive
  • Enterprise – 81.9% increase; injection molds cut tooling costs by up to 97%; innovation opportunities to be more responsive and agile to customer needs

Intelligence Everywhere

Computing is all around us and embedded in everything that we do, so we can no longer think just about “computing” and automation, but about intelligence: big data is not as important as BIG ANSWERS.

  1. Advanced, Pervasive and Invisible Analytics - Analytics embedded everywhere, not just one data warehouse
  2. Context-Rich Systems – Systems that understand and respond based on who, what, when, where, how and why
  3. Smart Machines - autonomous vehicles, advanced robotics, virtual personal assistants and smart advisors are leading us to a new age of machine helpers

The New IT Reality Emerges

  1. Cloud/Client Computing - unify cloud and mobile strategies
  2. Software-Defined Applications and Infrastructure – everything is programmable, with APIs for everything
  3. Web-Scale IT - Global class of computing that can deliver capabilities similar to the large cloud services providers to the enterprise
  4. Risk-Based Security and Self-Protection – security must be a consideration with a more and more digitized future, enable applications to protect themselves, security-aware application design

It’s a lot to absorb, so we will try to tackle some of these topics in more depth in future posts. Just remember, much of the science fiction you enjoy has its ideas rooted in current science and technology and in what is predicted for the future. With ingenuity and a commitment to innovation, you can move your business forward to take advantage of emerging technology trends in a practical manner.

For regular updates on technology trends, sign up for our free monthly newsletter.

Topics: Strategy

Case Study: Meeting the Needs of Postmodern Architecture with Web Services

By Eden Watt, Vice President, Application Innovation, Able-One Systems

If there’s one common challenge that I hear repeatedly from the IT leaders I meet, it’s how to integrate and merge disparate applications, especially after acquisitions or line-of-business changes.

The complexity of the systems managed by many organizations today can be mind-boggling. It builds up over many years of technology maturation, especially as the organization adds new products and services, or additional business units through mergers and acquisitions. Rather than acquiring new software to meet the comprehensive and evolving needs of the organization, this typically involves extending the original software, or working with multiple software applications for different business areas and stitching them together.

Gartner has coined the term “Postmodern ERP” to describe a “more federated, loosely coupled ERP environment with much (or even all) of the functionality sourced as cloud services or via business process outsourcers”. This growing trend indicates that the days of purchasing one monolithic ERP package to service all the processing needs of the business are coming to an end. Applications that have served the business well over many years do not necessarily need to be replaced as organizations embrace new cloud and mobile technologies to expand their capabilities.

Case Study: Challenger, a Transportation & Logistics Company 

The approach Challenger has taken to address their growing business is a great example of how to achieve this with the least amount of disruption to the business.


In business for 40 years, Challenger is one of Canada’s largest privately held trucking fleets, offering transportation, logistics, warehousing and distribution services. With their traditional business model, Challenger started with an RPG-based software package from Innovative Computer Corp (ICC) as their Transportation Management System (TMS), and Infinium for financials. They have knowledgeable in-house developers to maintain these applications, which have been stable and finely tuned to their needs for many years, especially in the FTL area of their business.

However, the Challenger business has grown significantly, both organically and through acquisitions, and they now offer a complete supply chain including truckload, LTL, LCV and small-parcel transport services, 3PL and 4PL support, intermodal, container shipping, air freight and ocean freight cargo shipping internationally.

To service logistics and other business areas, Challenger uses the cloud-based MercuryGate TMS, which allows customers to sign in and provides more customer-facing capabilities. However, their financial systems and LTL business are still handled on premise with the IBM i based Infinium and ICC. To manage the flow of invoice information between the two systems, Challenger’s invoicing department has been rekeying information between them on a daily basis.

Recently, Challenger acquired looksoftware’s soarchitect to develop web services integration and address the manual procedures for integration between their on-premise and cloud-based systems. The tool is unique in that you can target the RPG-based screens to create a web services layer, reducing the need to rewrite or duplicate complex logic when interfacing with legacy systems.

After a short learning curve, Les Peebles from Challenger was able to very quickly introduce automation in two key areas for the invoicing team, saving them considerable time and reducing errors.


He expects to continue to improve the processing power within Challenger using soarchitect, which has gotten rave reviews from both the users and IT.

Les says: “I thoroughly enjoyed working with the tool and our users have thoroughly enjoyed the result. An added benefit,” he adds, “is that the RPG programmers also feel comfortable with it because instead of writing a new web service that could circumvent their application, this is a technology enhancer instead of a replacer.”

The key to meeting the challenges of the “postmodern” world is to reuse, extend and integrate rather than replace outright, because by the time you implement a full-scale ERP replacement strategy, the world will have changed again.

Are you looking to integrate and merge disparate applications, but need help? Contact us for a free consultation.

Topics: Modernization
