IBM MQ v8.x End of Support Options

Are you still running IBM MQ v8.0.x (or an even earlier release)?  

IBM has announced that, as of April 30, 2020, it will end support for IBM MQ v8.0.x. If you are still using that version, it’s recommended that you upgrade to v9.1 to avoid the potential security issues that can arise in earlier, unsupported versions of MQ. 
MQ v9.1 Highlights: 

  • IBM MQ Console
  • The administrative REST API
  • The messaging REST API
  • Improvements in error logging 
  • Connectivity to Salesforce with IBM MQ Bridge to Salesforce
  • Connectivity to Blockchain 

What are your plans for IBM MQ?

I plan to upgrade:
It’s never too late to start planning your upgrade, and the newest release, IBM MQ v9.1, is a great option. It offers new features that help you manage costs, improve efficiency, and simplify manageability. 
Take a closer look here at some of the enhancements.  
If you are still considering your plans, now’s a great time to speak with our SME Integration Upgrade Team. Reach out to us today to set up a free Discovery Session, or contact us directly with any questions.

I would like to continue to use v8.0.x (or earlier versions):

It’s OK if you’re not ready for the newest version of MQ just yet. However, it’s important to remember that without support you may be exposed to avoidable security risks and additional support costs. IBM does offer Extended Premium Support, but be prepared: that option will be very expensive. 
Alternatively, as an IBM Business Partner, TxMQ offers expert support options. We have highly specialized skills in IBM software and can help guide you through issues that arise, at a fraction of the cost and with the added benefit of flexible service options. Check out more on TxMQ’s Extended Support page.

I will support it internally: 

If you have an amazing internal team, odds are they don’t have much time to spare. Putting the weight of a big support project on your internal team can cut into their productivity. For many organizations, this limits the team’s ability to focus on innovation and improving the customer experience. That will make your competitors happy, but your customers and clients definitely won’t be. 
Utilizing a trusted partner like TxMQ can help cut costs and give your internal team time back to focus on improvements rather than just break/fix maintenance. Reach out to our team to learn how we can help maintain your existing legacy applications and platforms so your star team can focus on innovation again. 

I don’t know, I still need help!

Reach out to TxMQ today and schedule a free Discovery Session to learn what your best options are!

From SOAs to Microservices – A Journey from Legacy to Modernization

Work Smarter with TxMQ Webinar Series 

Presented by TxMQ

Hosted by Arnold Shoon, Solutions Director at TxMQ  

The once-common practice of operating large, monolithic software systems supported on-premises is quickly changing. With the adoption of the cloud, microservices, and the new API economy, once slow-moving organizations have been challenged to accept new methodologies like DevOps to become increasingly agile to keep up with the competition. 

Existing mainframe customers that are currently heavily invested in Service Oriented Architecture (SOA) may be challenged to find the right tools and frameworks to start. 

In our discussion, Arnold Shoon breaks down the importance of APIs, outlines some of the benefits of more agile development and deployment methods, and discusses some of the tools available to organizations transitioning into the new API Economy. 

Below you can view the complete recording as well as the associated slide deck. Enjoy, and don’t forget to give us some feedback. We would love to hear any suggestions you have or subjects you would like to see covered in future webinars. Let us know here.




Emergency Changes… Are you Prepared?

Emergency change: when something on a system, application, or physical device is changed immediately in order to prevent an incident. It can be the result of an incident or of a failed change.
Take my word for it; I’ve seen it firsthand. Emergency changes are risky.
They are quick, “on the fly” production changes that usually don’t contain back-out plans. For this reason, and for many more I won’t dive into just now, it’s critical to have outlined processes for emergency changes.
Risky or not, many organizations don’t take pen to paper to write out these necessary processes to approve emergency changes. In such instances, I’ve found that people push through emergency changes as new code and skip the approval process altogether. Why? Because the process by which to submit and review those emergency changes virtually does not exist.
No red tape and no CAB review bureaucracy… sounds like CAB-utopia, right? Wrong. The consequences are far reaching, and they might just catch up to you.
So what’s the bottom line?
Use your CAB to review each and every emergency change that occurs using an after change review. The CAB should assess whether or not it can work to prevent similar emergency changes in the future. It should strive to discover the root cause through deep analysis and should likewise explore ways to eliminate those moving forward.
And, if a large number of emergency changes occur each week, raise those red flags and dig a little deeper.
Maybe you have no clear policy for Emergency Change.
Perhaps you have not identified the true root cause of an incident.
Or maybe you have yourself a trending Emergency Change that needs addressing.
Whatever the case, whatever the cause, emergency changes come up quite a bit in Change Advisory Board (CAB) meetings. Don’t let them slide by.  Are you prepared?
Let’s start a conversation.
(Original image by Perspecsys Photos.)

When to Automate BPM: Top 5 Questions

As we wade even deeper into the digital economy, workload automation has become more of a necessity and less of an amenity. Customers and clients expect you to produce results in real time, and a more robust BPM (business process management) automation tool can get you there. Gone are the days when BPM tools were simply helpful; now they are essential for every industry.
  • Banking: Your customers need to manage their accounts online and receive accurate information about their finances.
  • Utilities: Predicting and preventing blackouts will keep your customers satisfied, and detecting meter fraud will boost your bottom line.
  • Sales: Monitor and adjust custom-designed analytics that help you target the right audience with the right message.
  • IT management: Automate data-intensive or computing-intensive processes to boost the efficiency and accuracy of your IT department.
  • Internal workflow: Bridge the gap between business and IT with workflow that moves more fluidly between employees and departments.
A more robust BPM tool will help you align business processes more effectively with your customer expectations. If your company is struggling to keep up with customer demands and adapt to changing technology, BPM could help your business grow.
When is it time to consider using a more robust BPM automation tool?

  1. Multiple User Access: Does your business process span departments and route to various user roles?  Are there multiple types of user authority? Are there multiple ways (fax, email, mobile app, etc.) to trigger the process?  Do you need to allow for multiple types of user access (view, update, admin)? How many users will require access?
  2. Integration: Is there a need for integration with other applications? Does the process need to access or update multiple data sources? How and where is the data captured and stored? How much data and in what form is data being passed from step to step in the process?
  3. Complexity: Does it take more than 2-3 steps to complete a process? Does the information have to be escalated to more than one approver? Are there many roles or actors involved? Are there multiple inputs that trigger the business process? Are there many exceptions? Are there many decision points?
  4. Monitoring and Analytics: Is there a need for visibility to track and monitor the progress of the process? Do you want to measure the process based on cost and time? Are there KPIs you want to view? Is there a need for insight to each of the process instances?
  5. Room for Growth: Is there a need to have the environment scalable? Based on the number of users and the amount of data and other applications interfacing with the process, is performance important? Is it important to have the environment portable to different platforms?

Companies that successfully implement BPM automation tools typically see a number of benefits, including increased productivity, improved metrics, insight into operations and reduced costs. Careful pre-planning for BPM automation will ensure a more successful and cost-effective implementation.

IBM MQ For Click And Collect Retail

Few industries have gone through as much change as retailing. Every retailer in business today is dynamic; that’s the only way to survive, especially because the space continues to change rapidly with innovations such as multi-channel retailing, leaner and more responsive inventory management, and the new click-and-collect phenomenon.
Click and collect is sometimes called “click and mortar” because a customer shops and buys online, but opts to collect the order in-store. It beats waiting (and paying) for the postman, it saves browsing time in the store, and shoppers can quickly research products and reviews.
Click and collect is scaling rapidly, but in order to offer the service, both the retailer and the customer must be able to access real-time stock and delivery schedules. Then, the item must be reserved for the customer at the instant of purchase.

That’s where IBM MQ for click and collect comes in, because MQ Advanced supports more reliable asynchronous queries to stock and information at all stores in the network. Thus stock is updated accurately and reliably in real-time via transaction coordination. And remember that many customers will be accessing the storefront via mobile, which creates its own set of problems. But MQ Advanced is a mobile enabler, because of its quality-of-service message delivery, as well as the baked-in, lightweight MQTT protocol to support always-on push notifications that don’t hog battery or data. (MQTT was originally designed for small, unreliable sensors for things like oil pipelines and machinery. It was adopted as the open Internet of Things standard and also powers Facebook Messenger.)
I should also point out that retail is a great example of an industry sector that needs bulletproof, foolproof IT spread across geography. The same system must run in every store, it must run well, and it must be accessible by folks who aren’t always tech savvy. This has generally led to file-based solutions for retail IT, and the resultant need to batch process. But today’s systems, as noted above, must trickle-update based on real-time transactional data. That means stores must process data more quickly and move it into the enterprise much more rapidly, and the central system must update in real time. The answer, again, is already within MQ Advanced.
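Under the hood, the reserve-at-purchase step is an atomic check-and-decrement against shared stock. Here is a minimal Python sketch of just that logic; the store and SKU names are made up, and the MQ transport itself is out of scope:

```python
import threading

class StockLedger:
    """In-memory per-store stock counts with atomic reservation."""

    def __init__(self, stock):
        self._stock = dict(stock)      # (store, sku) -> units on hand
        self._lock = threading.Lock()  # serializes check-and-decrement

    def reserve(self, store, sku, qty=1):
        """Reserve qty units at the instant of purchase.

        Returns True and decrements stock if enough units are on hand;
        otherwise returns False and leaves the count untouched.
        """
        with self._lock:
            key = (store, sku)
            if self._stock.get(key, 0) >= qty:
                self._stock[key] -= qty
                return True
            return False

    def on_hand(self, store, sku):
        with self._lock:
            return self._stock.get((store, sku), 0)

ledger = StockLedger({("store-42", "sku-123"): 2})
print(ledger.reserve("store-42", "sku-123"))  # True: first shopper gets it
print(ledger.reserve("store-42", "sku-123"))  # True: last unit reserved
print(ledger.reserve("store-42", "sku-123"))  # False: sold out
```

In a real deployment the check-and-decrement would be a transaction coordinated over MQ rather than an in-process lock, but the invariant is the same: a unit is never promised to two shoppers.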
Managed File Transfer is native inside MQ Advanced, which means point-of-sale files can move into the enterprise over an IBM MQ network with secure, reliable, traceable, guaranteed delivery.
According to IBM, a large US grocery retailer used to batch process its data, which delayed analysis and made it difficult to detect theft. Using IBM MQ, IBM MQ Managed File Transfer and MQTT, the company’s data warehouse now receives near-real-time transaction data from 2,400 stores.
TxMQ is widely regarded as one of the premier MQ solutions shops in North America. We’ve been in business since 1979. Our customers typically start with our free, no-obligation discovery session. Want to improve your enterprise application performance? Want to know about how MQ is driving the digital economy, and how you can climb aboard? Let’s get the conversation started. Click Here For Our Free Discovery Session Offer.
(Image from DaveBleasdale)

MQ, The Digital Economy & You

IBM MQ continues to evolve to meet the expanding needs of the digital economy. We encounter many organizations that have yet to take full advantage of the capabilities they’ve invested in their MQ backbone. Learn about the new MQ physical appliance, as well as the MQ virtual appliance (available exclusively from TxMQ). Other topics include secure transfer, cloud messaging and more.

How IBM MQ v8 Powers Secure Cloud Integration

In this quickly growing digital economy, where we have increasing demands on things like security, cloud and mobility, IBM MQ has been growing to meet those demands. To pick two of the three topics, MQ v8 can deliver secure cloud integration straight out of the box.
It’s important to know what type of cloud we’re really talking about. Are you moving all of your services into the cloud, even your virtual desktops? Are you talking about a hybrid cloud, with a mix of cloud computing supplementing your own services? Or a private cloud, where segments of internal computing services are totally isolated from general services? There are different considerations for each scenario.
Regardless of the type of cloud-computing services you’re using, you still need to integrate these services, and you really need to ensure that your integration has security, data integrity and the capability of sending messages once-only with assured delivery. Cloud can’t provide that on its own. MQ can and does, and it does so out of the box, with several recent enhancements to ensure secure integration.
With the digital economy, we’re all sharing all this data, including personal, banking and health data. We need to keep this data secure when it’s being shared, and control who has access to it. Then of course there’s the large compliance piece we need to meet. How does MQ meet all these demands? The answer is authentication, and MQ’s solution is still the same as being asked for proof of ID at the post office when you go to pick up a package. MQ v8 has been enhanced to support full user authentication right out of the box. No more custom exits and plugins.
For distributed platforms, you have local OS authentication, or you can actually go to centralized data. For z/OS you’re still focused on local authentication.
And this next point is important: MQ has for quite some time supported certificate authentication of applications connecting to MQ services, but this always meant that the public MQ key had to be shared with everyone. MQ has now been enhanced to support multiple certificates for authentication, securing of connections and encryption using separate key pairs. MQ still supports both SSL and TLS, although there are strong recommendations to switch from SSL to TLS because of the POODLE vulnerability.
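For reference, both v8 features described above are switched on with ordinary MQSC commands. This is only a sketch, not a hardened configuration: the object names, certificate labels and cipher spec below are placeholders you would replace with your own.

```
* Connection authentication: require a valid OS user ID and password
DEFINE AUTHINFO(USE.OS.PW) AUTHTYPE(IDPWOS) CHCKCLNT(REQUIRED)
ALTER QMGR CONNAUTH(USE.OS.PW)
REFRESH SECURITY TYPE(CONNAUTH)

* Per-channel certificates: the queue manager keeps a default label,
* while one channel presents its own key pair
ALTER QMGR CERTLABL('qm1defaultcert')
DEFINE CHANNEL(PARTNER.SVRCONN) CHLTYPE(SVRCONN) TRPTYPE(TCP) +
       SSLCIPH(TLS_RSA_WITH_AES_128_CBC_SHA256) +
       SSLCAUTH(REQUIRED) CERTLABL('partnercert')
```

The CERTLABL attribute on the channel is what lets a single queue manager present different certificates to different partners instead of sharing one key with everyone.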
(Image from mrdorkesq)

MQTT Repositories Review – Mosquitto, MessageSight & More

In my previous blog (Rigorous Enough! MQTT For The Internet Of Things Backbone), I presented the MQ Telemetry Transport (MQTT) protocol, which helps provide the required communication for smart devices. But without a broker repository or destination to support the protocol, MQTT can’t complete its mission.

In this article, I’ll first review one of the open-standard MQTT repositories called Mosquitto, and then cover IBM MessageSight. In future blogs I’ll present additional information on both the security component and additional broker functionality.

Mosquitto is an open-source (BSD-licensed) message broker that implements the MQTT protocol versions 3.1 and 3.1.1. It provides a lightweight server implementation of the MQTT and MQTT-SN protocols, written in C, so it can run on machines that can’t run a JVM.

The Mosquitto executable is typically on the order of 120 kB and consumes around 3 MB of RAM with 1,000 clients connected. There have been reports of successful tests with 100,000 connected clients at modest message rates.

In addition to accepting connections from MQTT clients, Mosquitto can bridge to other connected MQTT servers, including other Mosquitto instances. It’s thus possible to architect MQTT server networks, and pass MQTT messages from any network location to any other.
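As a concrete illustration, bridging one Mosquitto instance to another takes only a few lines in mosquitto.conf. The connection name, hostname and topic names here are placeholders:

```
# Bridge a local store broker to a central broker
connection store42-to-central
address central.example.com:1883

# Push local point-of-sale events upstream at QoS 1
topic pos/# out 1

# Pull price updates down from the central broker
topic prices/# in 1
```

Each `topic` line names a pattern, a direction (`in`, `out` or `both`) and a QoS level, which is how messages get routed from any network location to any other.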

A second repository for MQTT is IBM MessageSight, which is built for high performance and offers persistent, transactional messaging. The hardware is a 2U form factor. IBM MessageSight includes built-in security and integrates with external Lightweight Directory Access Protocol (LDAP) security systems. MessageSight also offers Transport Layer Security (TLS), Secure Sockets Layer (SSL), FIPS 140-2 validation, NSA Suite B ciphers and a Level 1 secure cryptographic store.

Fine-grained messaging-authorization policies restrict access based on combinations of: user or group, client identifier, protocol, network interface, listening address and/or port, client IP address or range and destination topic and queue name.

The MessageSight repository supports connectivity to WebSphere Message Broker via JMS and/or MQTT nodes. It also integrates with Java environments and with rich HTML5-based web applications. Additionally, MessageSight allows development of interactive mobile-messaging applications with IBM Worklight Studio Developer, which delivers:

  • Friendly APIs and libraries
  • MQTT clients and libraries for a variety of platforms (C- and Java-based APIs)
  • Libraries for Google Android and Apple iOS
  • JMS client
  • JavaScript API for HTML5-based applications
  • PhoneGap MQTT plugins with JavaScript API for use with IBM Worklight
  • Apache Cordova
  • Adobe PhoneGap

MessageSight also offers simple and scalable management through policies. A single user ID is defined on the queue manager for IBM MessageSight, which enables a business to sense and respond to data coming from the edge of the enterprise. IBM MessageSight offers high availability through an active/standby pair.

There are also several public brokers, including HiveMQ, which anyone can connect to and experiment with, and CloudMQTT, a broker hosted in the cloud. Other implementations in the broker space include GnatMQ, an MQTT implementation specifically for .NET, and ActiveMQ, a product of the Apache group.

Time To Study For Your PMP – Don't Panic!

You’ve received the green light from PMI to schedule your test and you’re ready to go! Just one small thing: you have no idea how much time you should give yourself or what you should spend your study time on.
There’s so much information out there, especially boot camp advertisements with “pass guarantees.” There are tests you can purchase, books you can buy and a fair amount of fear mongering on project management websites, as well.
I wanted to share my method because it was cheap and, for me, it worked. The test is either pass or fail; you don’t receive a percentage, but you do get a breakdown of how you did in each of the five areas: initiating, planning, executing, monitoring and controlling, and closing. You’re rated as either proficient, moderately proficient or below proficient. I was proficient in four areas and moderately proficient in one. I hope this gives you the confidence to believe me when I say: you do not need to drop $1,000+ on a project management boot camp!
Ok, so what should you do? First, my suggestion is to give yourself four weeks to study. If you give yourself more time you might get in the habit of thinking you don’t need to buckle down because you have more than enough time. You could possibly do this in less time, but I spent about 1-2 hours a day over four weeks. If you want to put in more time each day, you can condense this down to about two weeks.
If you read my first post you know I suggest signing up for PMI membership, and if you did this you got access to the latest electronic version of the Project Management Body of Knowledge Guide (PMBOK). This book is the Bible for project management. Now don’t hate me when I tell you this (remember I just saved you $1,000), but you’re going to need to read it. It’s dry; there are no anecdotes and no cartoons, just facts. But read it once, and then you’re done with it other than as a reference material. I promise.
Now that you have a general concept of the project phases, knowledge areas and processes, you need to memorize them. All of them. The five phases and 10 knowledge areas shouldn’t be too hard.
I used “Integrating Scope and Time Costs Quality Human Resources to Communicate with a Risk of Procuring Stakeholders” as my little reminder for the knowledge areas.
I know, it’s not super catchy; but, it’s not terrible either.
The best way to then memorize the 47 processes, from my point of view, is to memorize how many are in each column (2, 24, 8, 11, 2 across the top) and then in each row (6 – Integration, 6 – Scope, 7 – Time, 4 – Cost, 3 – Quality, 4 – Human Resources, 3 – Communications, 6 – Risk, 4 – Procurement, 4 – Stakeholder). I stared at the processes chart and then tried writing it out from memory daily. By the start of week 3 of studying you should have this down, but continue writing it out anyway. Disclaimer: I absolutely believe in the power of rote memorization.
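If you like to sanity-check your memorization, the grid is easy to encode. The counts below follow the PMBOK 5th edition (the version this post covers), and the two views must cross-foot to the same 47 processes:

```python
# Processes per knowledge area (rows) in PMBOK 5th edition
by_knowledge_area = {
    "Integration": 6, "Scope": 6, "Time": 7, "Cost": 4, "Quality": 3,
    "Human Resources": 4, "Communications": 3, "Risk": 6,
    "Procurement": 4, "Stakeholder": 4,
}

# Processes per process group (columns)
by_process_group = {
    "Initiating": 2, "Planning": 24, "Executing": 8,
    "Monitoring and Controlling": 11, "Closing": 2,
}

# Both views count the same 47 processes
assert sum(by_knowledge_area.values()) == sum(by_process_group.values()) == 47
print(sum(by_process_group.values()))  # 47
```

Writing the chart out from memory and then checking it against a table like this is a quick way to catch the row you keep forgetting.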
The next thing I suggest memorizing cold is the project management formulas; I also wrote these out daily. Here’s my list:

  • CPI: EV/AC (considered the most important earned value metric)
  • EAC: AC + bottom-up estimate to complete
  • Communication channels: n(n-1)/2
  • PERT estimate: (P + 4M + O)/6
  • Activity standard deviation: (P - O)/6, so activity variance: ((P - O)/6)^2
  • Future value: Present value x (1 + r)^n
  • Present value: Future value / (1 + r)^n
  • Return on investment: (Benefit - Cost)/Cost

There are a few more you could memorize, but this is what I did. There’s only so much data you can get down cold and you’ll have to pick and choose.
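These formulas are easy to get backwards under exam pressure (future value and present value especially), so here are a few of them in Python for self-testing. The function and variable names are mine:

```python
def cpi(ev, ac):
    """Cost Performance Index: earned value over actual cost."""
    return ev / ac

def comm_channels(n):
    """Communication channels among n people: n(n-1)/2."""
    return n * (n - 1) // 2

def pert(p, m, o):
    """PERT estimate: (pessimistic + 4 * most likely + optimistic) / 6."""
    return (p + 4 * m + o) / 6

def activity_variance(p, o):
    """Variance of a PERT estimate: ((P - O) / 6) ** 2."""
    return ((p - o) / 6) ** 2

def future_value(pv, r, n):
    """FV = PV * (1 + r) ** n."""
    return pv * (1 + r) ** n

def present_value(fv, r, n):
    """PV = FV / (1 + r) ** n."""
    return fv / (1 + r) ** n

print(comm_channels(5))                    # 10 channels on a five-person team
print(pert(14, 8, 2))                      # (14 + 32 + 2) / 6 = 8.0
print(round(future_value(100, 0.10, 2), 2))  # 121.0
```

A handy self-check: present_value(future_value(x, r, n), r, n) should always give x back, which is a quick way to confirm you haven’t swapped the two.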
These two chunks of data are what I included in my brain dump prior to starting the exam. This means that I spent a portion of the testing demo writing this out before taking the test – this way I didn’t cut into the time I had to take the actual test.
As far as ITTOs (Inputs, Tools and Techniques, Outputs), I did not memorize these. I worked on understanding them and being able to recognize the most common. Make sure you have a firm grasp on what constitutes an Enterprise Environmental Factor (an organization’s culture, governance and structure) and what constitutes an Organizational Process Asset (processes, procedures, and knowledge base).
There are some things you’ll need to know about project management that the PMBOK does not cover – remember this exam isn’t just about studying a book, you’re proving that you know and live project management daily. So do yourself a favor and look up these names: Deming, Fiedler, Shewhart, Ouchi, Juran, Douglas McGregor, Frederick Herzberg, Maslow, McClelland, Vroom and Crosby, along with the concept of Kaizen. These are all theorists or ideas in either quality management or behavior management, and their theories have an impact on project management processes.
The other half of studying is testing what you’re retaining. There are a ton of practice exams online and a lot ask you to pay. I don’t think you need to. I really hope you got that PMI membership, because they have a link on their website to something called Books24x7. It’s access to a ton of relevant reading material. Right now a book called PMP Exam Prep: Questions, Answers & Explanations, 2013 Edition by Christopher Scordo is up there. I took every test in that book and reviewed every answer, both those I got right and those I got wrong. Anything I repeatedly got wrong went on a note sheet to be reviewed daily until I did get it, and anything that I didn’t recognize went on a sheet to be Googled later.
After finishing this book I moved on to a free practice-exam site. It has full-length exams with 200 questions, and gives you the option of reviewing all questions and their answers afterwards. It also tracks your progress, shows your scores in a nifty little bar chart and even breaks down your score by project phase area (just like the real exam!). By the time I sat for the exam I’d racked up another eight exams. I would usually do one a night, with a complete review, and then look at the notes I’d taken from questions I missed on past exams. I was scoring between 69% and 88% once I moved on from Christopher Scordo’s exams.
The night before my test, I took one last practice exam, reviewed the answers and looked at my notes. Then I went to bed early (Don’t skip this part, please; it’s easy).
The day of the test I had breakfast, read my notes one last time and then arrived at the testing center 30 minutes early. All centers are different but I believe the majority recommend coming early as you’ll need to check in (you may need two forms of identification) and put your items in a locker.
Once you’re in the testing center, remember these steps: do your brain dump first, breeze through the testing demo and then focus! I wish you the best of luck on the exam!