There's new stuff! We're constantly making AWS Blu Insights better. Here are some of the notable new features and improvements that we've made to AWS Blu Insights since it first launched.
Since our latest release, we deployed 10 new versions of AWS Blu Insights that bring improvements and new features. This agility is essential to let our customers move forward on their modernization projects and deliver for their own customers. This release summarizes all those improvements, new features and bug fixes. A huge thank you to the hundreds of active users for their feedback and trust.
We introduced a new assistant to help define an optimal scope for the AWS Blu Age Calibration phase. This optimal scope is found by following a few rules and using a set of metrics:
To read more about this new feature, see https://bluinsights.aws/blog/say-hello-to-calibration-scoping.
The Classification tool now better categorizes CSD files. It dissociates 'CSDCommand' from 'CSD'. This dissociation helps the Transformation Center process those files correctly. The Dependencies analysis still processes those two file types as CSD.
In the dependencies graph, the grouping option of workpackages and labels set by the user is saved for future sessions.
SAVF extraction generates a "metadata.json" file for each extracted SAVF file, detailing information such as the filename, description, creation date, and update date. This metadata is valuable for tracking changes between different versions of the same SAVF file. By comparing the metadata files, users can easily identify added, removed, or changed source files across versions.
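The version comparison described above can be sketched as follows. This is a minimal illustration: the field names (filename, update date) follow the description above, but the exact JSON layout produced by the SAVF extraction is an assumption here.

```python
# Hypothetical sketch: diff two metadata.json payloads from a SAVF extraction.
# Each payload is assumed to be a list of {"filename": ..., "update_date": ...}.
def diff_savf_metadata(old, new):
    """Return added, removed and changed source files between two versions."""
    old_by_name = {m["filename"]: m for m in old}
    new_by_name = {m["filename"]: m for m in new}
    added = sorted(new_by_name.keys() - old_by_name.keys())
    removed = sorted(old_by_name.keys() - new_by_name.keys())
    changed = sorted(
        name
        for name in old_by_name.keys() & new_by_name.keys()
        if old_by_name[name]["update_date"] != new_by_name[name]["update_date"]
    )
    return added, removed, changed
```

Comparing two versions this way directly yields the added, removed and changed members without opening the SAVF archives themselves.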
In Cobol, the COPY statement makes a Cobol program factorable, and the COPY REPLACING statement makes it generalizable. The Cobol compiler substitutes this statement with the copybook content after performing string replacements.
A copybook defining a generic structure:
01 AA-DATA.
10 AA-ID PIC X(9).
10 AA-TYPE PIC X(1).
A COPY REPLACING statement in a Cobol program:
COPY TEXTA REPLACING ==01 AA-DATA== BY ==05 EE-DATA==
                     ==AA-ID== BY ==EE-ID==
                     ==AA-TYPE== BY ==EE-TYPE==.
The resulting Cobol code after string substitution.
05 EE-DATA.
10 EE-ID PIC X(9).
10 EE-TYPE PIC X(1).
The Dependencies engines can now handle all these cases.
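The substitution the compiler performs can be sketched in a few lines. This is an illustrative model only (not the Blu Insights implementation, and a simplification of real COBOL pseudo-text matching rules):

```python
# Toy sketch of COPY REPLACING expansion: substitute each ==old== BY ==new==
# pair in the copybook text, as the Cobol compiler conceptually does.
def copy_replacing(copybook_text, replacements):
    for old, new in replacements:
        copybook_text = copybook_text.replace(old, new)
    return copybook_text

copybook = """01 AA-DATA.
   10 AA-ID PIC X(9).
   10 AA-TYPE PIC X(1)."""

# Pairs taken from the COPY TEXTA REPLACING example above.
expanded = copy_replacing(
    copybook,
    [("01 AA-DATA", "05 EE-DATA"), ("AA-ID", "EE-ID"), ("AA-TYPE", "EE-TYPE")],
)
```

Running the pairs from the example above against the copybook yields the EE-DATA structure shown earlier.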
On AS/400, a utility called SDA (Screen Design Aid) facilitates menu creation. AS/400 accesses a menu using the GO command, and the CHGPRF command in the user profile sets the default menu displayed when a session starts. This utility creates three or four files for a menu, following a naming convention. Hence, the dependency analysis follows this naming convention to establish links.
The dependencies analysis engine supports multiple languages. We introduced a new capability to support all these languages in the same codebase. For example, it is possible to analyze assets mixing Java source files and Mainframe files. For now, the UI hasn't changed; in the future, users won't have to select a language family.
Fixed a bug blocking transformation runs from being launched when using the Blu Insights Builder. Transformation runs are now created and executed as expected.
Fixed an issue where Version Manager project creation failed and created projects could not be deleted. The creation of Version Manager projects now works correctly.
Resolved an issue that prevented Cyclomatic Complexity generation when the Classification file contains unexpected fields.
After releasing version 4.0 of Blu Insights at the last Diadem release, we took advantage of this release to raise the security bar and to improve and stabilize the many new features related to GenAI. Many of you gave positive feedback on these new features, so thank you for your valuable feedback!
We are happy and proud to announce the release of AWS Blu Insights 4.0, a major new version with AI features.
AWS Blu Insights was imagined and released with the goal of lowering the barriers to modernization, automating repetitive tasks, establishing standards, streamlining processes, and expediting onboarding. Working backwards, we built and iterated over dozens of features to address those requirements (e.g. code inventory, dependencies analysis, project management, versions management and code transformation). We tirelessly continue this investment by mixing innovation initiatives and incorporating users' feedback on ongoing projects. Today, the rise of Generative AI (GenAI) and services like Amazon Bedrock allows us to imagine and deliver new features easily and rapidly. We are excited to share new features across various areas of mainframe codebase assessment, code transformation help, and project speed-up.
Thank you to all our customers for their trust and feedback to help us improve, simplify and enrich our products.
AI features are available to L3 Certified individuals.
The -INC compiler instruction is now supported in the dependencies analysis for ANSI COBOL. More details about these features can be found in the documentation, the FAQ pages, and the What's new post.
Thank you to our customers who are using AWS Blu Insights daily to deliver mainframe modernization projects. We have mainly dedicated this new version to delivering your latest requests, e.g. performance improvements and dependencies enhancements. Thank you for your valuable feedback.
A new COBOL parser timeout preference (prefs.COBOL.timeoutParser) lets you move forward on the mass transformation of your codebase without getting blocked on specific files (e.g. large files with over 30k lines of code) which may need further investigation.
In VSAM definitions, the INDEXED property indicates that the cluster is a KSDS.
DEFINE CLUSTER(NAME(MY.VSAM.CLUSTER) -
INDEXED -
CYL(15 3) -
RECORDSIZE(18 18) -
KEYS(12 0) FSPC(09 10) -
CISZ(4096)) -
DATA(NAME(MY.VSAM.CLUSTER.DATA)) -
INDEX(NAME(MY.VSAM.CLUSTER.INDEX))
PCB TYPE=DB, *
DBDNAME=dbd-file, *
PROCOPT=A, *
KEYLEN=99, *
PCBNAME=pcb-file
The dependencies analysis now handles the NAME parameter along with the existing DBDNAME parameter:
label PCB TYPE=DB,NAME=dbd-file
In RPG, the following statements are now supported:
Exec Sql Delete From LIB/SQL_OBJECT;
C *DTAARA DEFINE TOTAL TOTNET
By partnering with AWS security engineers, we have elevated the security of Blu Insights to a higher level. We constantly raise the bar following first class standards. Besides the security aspect, in which we invested heavily since the beginning of the year, we have also delivered new features and fixed a bunch of minor bugs. Enjoy! 🍿
A new UI for Transformation Center project configuration is now available. This new UI will help you configure your project. This beta version contains the Metadata, Transform and Generate property sets. Your feedback is welcome!
In Dependencies graph, users can group nodes by Labels and/or Workpackages. As a result, new nodes will represent Workpackages and/or Labels and their links with other nodes (including links between nested Workpackages).
Assembly Language is the lowest-level mainframe language before machine code. It includes the ASM, MLC, MAC, ALC and MACRO file types. MAC and MACRO are assembly macros, similar to parameterized functions. The cyclomatic complexity analysis includes a new assembly language analysis based on mnemonic codes. The dependencies assembly language analysis handles ten new statements, including macro calls from assembly.
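A mnemonic-based complexity measure can be illustrated as follows. This is a rough sketch, not the actual engine's rule set: the branch mnemonic list is a simplified assumption, and real analysis covers many more opcodes.

```python
# Toy sketch: approximate cyclomatic complexity of assembly source by
# counting branch mnemonics (decision points). The mnemonic list below is
# an illustrative subset, not the engine's actual rules.
BRANCH_MNEMONICS = {"BE", "BNE", "BH", "BL", "BC", "BCT", "BXLE", "BXH"}

def assembly_complexity(lines):
    complexity = 1  # one linear path by default
    for line in lines:
        fields = line.split()
        # the opcode is the first field, or the second when a label is present
        for field in fields[:2]:
            if field in BRANCH_MNEMONICS:
                complexity += 1
    return complexity
```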
We added reminders that will notify the user when their session is about to expire, prompting them to save their changes before it ends.
Users can leave Secured Spaces that are shared with them. They can no longer view or use the Spaces after they unsubscribe.
Since Java 17 migration, some runs were failing during the transformation step.
When a user tried to join a To-dos Board they did not have access to, the page loaded indefinitely. They are now redirected to the 404 page.
The entry point values were not the same between the dependency chart and the workpackage table.
It was possible to send multiple Blu Age Toolbox requests for the same account and tool.
The NAME parameter of the PCB statement is now handled like the DBDNAME parameter. It leads to a link to a DBD file.
Following the launch of AWS Blu Insights in 15 regions, we continued our collaboration with our Security Engineers to meet AWS standards in terms of security and operational excellence. We also dedicated efforts to enhance product features and provide support to our customers on ongoing projects.
Users can now manage the download permissions of uploaded attachments in Codebase, To-Dos and Transformation Center projects. Attachments downloads can be allowed based on the users’ profile.
Cobol dynamic calls through a variable (e.g. CALL VAR) are better handled. In the previous version, we created a link for every value of VAR even though the value is set only before the call statement. In this version, the following example is treated like a direct call of PROGA.
MOVE "PROGA" TO VAR.
CALL VAR.
CREATE PROCEDURE SYSPROC.MYPROC(IN INT, OUT INT, OUT DECIMAL(7,2))
LANGUAGE COBOL
EXTERNAL NAME MYMODULE;
Mainframe - JCL: when a variable contained a program name followed by parameters, the dependency analysis didn't consider the dynamic call.
In the past few months, we collaborated closely with the AWS Security team, conducting 2 PenTests with 15 dedicated PenTesters over 9 weeks. They scrutinized the code, infrastructure, features, and 900+ APIs of AWS Blu Insights. This was an intensive period for our team, but the outcomes are outstanding for the benefits of our customers. 🕵🏽
Although we mainly focused on the PenTests, we continued to improve the product features and support our customers.
This new service allows you to see at a glance the list of AWS Blu Age products, their documentation and distribution mode, to request access to one or multiple products, and to follow requests and their status (see https://bluinsights.aws/blog/say-hello-to-bluage-toolbox).
In SQL, the same object can be created from two or more places. Example: CREATE VIEW V1 FROM TABLE T1 in one file, while another file contains CREATE OR REPLACE VIEW V1 FROM TABLE T2. Both create the same object V1. Initially, only one of them was considered by the analysis, leading to missing dependency links. Now all possibilities are considered. As a result, the dependency analysis can produce all outflow dependencies between the view object V1 and the tables T1 and T2. This also applies to other SQL objects like Triggers, Procedures, etc.
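The multi-definition handling can be sketched as follows: instead of keeping a single defining file per object, the analysis keeps every definition and emits a link for each. The tuple-based data model here is a hypothetical simplification of the real parser output.

```python
# Sketch under assumptions: each parsed CREATE [OR REPLACE] VIEW yields a
# (view_name, source_table) pair; all definitions of the same view are kept.
from collections import defaultdict

def collect_view_links(statements):
    """statements: list of (view_name, source_table) pairs."""
    links = defaultdict(set)
    for view, table in statements:
        links[view].add(table)  # keep every definition, not just the first
    return {view: sorted(tables) for view, tables in links.items()}

# Two files both defining V1, as in the example above.
parsed = [("V1", "T1"), ("V1", "T2")]
```

With both definitions retained, V1 ends up linked to both T1 and T2.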
// S001PGM='SAMPLE,COND=(16,GT,V00000)'
//TEST EXEC PGM=&S001PGM
This was not considered a valid program name by the dependencies analysis, leading to a missing node named &S001PGM. It is now fixed with a link to SAMPLE.
Thank you all for your support, feedback and requests. We pushed the limits together to continually improve AWS Blu Insights. You will love our roadmap for 2024. It’s always day 1! 🔥
For this new release, and with a reduced team, we focused on quality and security, i.e. PenTest, BugBash and Major upgrades. We also delivered a set of improvements and bug fixes, mainly on dependencies analysis and the Transformation Center. See below for more details.
To help investigate Velocity Issues, developers can now download related artifacts (legacy files, their dependencies, the run configuration, outputs and error details) from the issue details pop-up.
MNUDDS files describe the Data Description Specification of an application menu. The dependency analysis can now handle this type of file and detect its dependencies. It handles the FILE, FILES and INTERFACE statements. In the following examples, a dependency will be created to a file named FILE_ID:
A File (CALL FILE_ID)
A Files (FILE_ID)
A Interface (FILE_ID)
The EXEC CICS SEND statement can deal with maps and map sets as follows.
EXEC CICS SEND MAP("MAP_NAME")
EXEC CICS SEND MAPSET("MAPSET_NAME")
Before, the dependencies analysis considered that MAP_NAME and MAPSET_NAME were equal to the BMS file name. This inaccuracy implied false missing nodes. Now the dependency analysis handles the DFHMDI and DFHMSD statements, which declare respectively a map and a map set in BMS files. The following examples show how a map and a map set are declared in BMS files.
MAP_NAME DFHMDI SIZE=(...
MAPSET_NAME DFHMSD SIZE=(...
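Extracting those declarations can be sketched with a regular expression. This is illustrative only; real BMS parsing has more cases (continuations, comments) than the two examples above.

```python
import re

# Illustrative sketch: pull map and map set names out of BMS source, where
# the name precedes the DFHMDI/DFHMSD keyword, as in the examples above.
BMS_DECL = re.compile(r"^(\S+)\s+(DFHMDI|DFHMSD)\b")

def bms_declarations(lines):
    maps, mapsets = [], []
    for line in lines:
        m = BMS_DECL.match(line)
        if m:
            (maps if m.group(2) == "DFHMDI" else mapsets).append(m.group(1))
    return maps, mapsets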
In RPG programs, free-form statements can be written in fixed-form code by having a space character in columns 6 and 7. From now on, the dependency analysis handles those statements. In the following example, dcl-f is a free-form statement surrounded by fixed-form statements.
C callp "PROGRAM"
dcl-f PID workstn extfile('LIBRARY/PROGRAM-ID') indds(indMap)
sfile(SFL_STOR:#RRN);
dindMap ds
A free-form statement ends with a line ending with a semicolon.
An input to specify the request reason has been added to the access request form to help the accreditation team validate new users faster.
Most resource names (datasets, workpackages, etc.) require at least 2 characters. The Excel Import warning message has been modified to reflect these requirements.
Generally, the FILE property in CL implies a dependency on a corresponding file. For instance ... FILE(MY_FILE) ... creates a dependency to MY_FILE. However, for the CRTSRCPF and DSPFD commands, this property should be ignored. The following examples no longer create a dependency.
CRTSRCPF FILE(MY_FILE)
DSPFD FILE(MY_FILE)
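The exception rule can be sketched as follows (a simplified illustration, not the real CL parser; the command list comes from the release note above):

```python
import re

# The FILE(...) parameter normally yields a dependency, except for commands
# where it names the object being created/displayed rather than a used file.
IGNORED_COMMANDS = {"CRTSRCPF", "DSPFD"}

def cl_file_dependency(statement):
    """Return the dependency target of a CL statement, or None."""
    m = re.search(r"^(\S+).*\bFILE\((\w+)\)", statement)
    if not m or m.group(1) in IGNORED_COMMANDS:
        return None
    return m.group(2)
```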
The dependencies analysis already handles the LIBRARY/PROGRAMID notation to determine which program to link. This behaviour has been extended to external links in fixed-form and free-form.
dcl-f PROGRAMID workstn extfile('LIBRARY/PROGRAMID')
The dependencies analysis now establishes a dependency link between a trigger object and the table or view objects used in the WHEN clause of the trigger creation statement. The existing link between the trigger object and the targeted table or view object, due to the ON clause, still applies. Example: for the SQL statement below, two dependency links will be established. First, from the trigger object TRIGGER_1 to the table object TABLE_1 because of the ON clause. Then, from the trigger object TRIGGER_1 to the table object TABLE_2 due to the WHEN clause.
CREATE TRIGGER TRIGGER_1 BEFORE INSERT ON TABLE_1
REFERENCING
NEW AS N
FOR EACH ROW
MODE DB2SQL
WHEN (NOT EXISTS (SELECT 1
FROM TABLE_2))
The dependencies analysis now handles the creation of SQL function objects. Example: a link will be created from the query source file to a new node named FUNCTION_1.
CREATE FUNCTION FUNCTION_1;
Along with function object creation, the dependency analysis also establishes a dependency link between the trigger object and the function object thanks to the EXECUTE clause in the trigger statement. Example: for the SQL statement below, a dependency link is formed between the trigger object TRIGGER_1 and the function object FUNCTION_1.
CREATE TRIGGER TRIGGER_1
INSTEAD OF INSERT ON VIEW_1
FOR EACH ROW
EXECUTE FUNCTION FUNCTION_1();
SQL statements in the Mainframe analysis were stripped, resulting in a failure to create the node. This has been fixed.
Improved the determination of the dependency link between Programs/Sub-programs and Copycodes for the PERFORM statement. With the new implementation, a PERFORM dependency link is created only if the copycode has been included in the program or sub-program through an INCLUDE statement. In the following example, a PERFORM dependency link is formed between PROGRAM1.NSP and COPYCODE1.NSC due to the INCLUDE statement.
PROGRAM1.NSP
PERFORM S1
INCLUDE COPYCODE1
COPYCODE1.NSC
DEFINE SUBROUTINE S1
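The INCLUDE-gated rule can be sketched as follows. This is a simplified illustration under assumptions: the program source is modeled as plain lines and the copycode contents as a precomputed map, which is not how the real engine represents them.

```python
# Sketch: a PERFORM of subroutine S creates a link to a copycode only when
# that copycode is pulled in by an INCLUDE in the same program.
def perform_links(program_lines, copycode_subroutines):
    """copycode_subroutines: {copycode_name: set of subroutine names it defines}"""
    included = {
        line.split()[1]
        for line in program_lines
        if line.strip().upper().startswith("INCLUDE ")
    }
    links = set()
    for line in program_lines:
        parts = line.split()
        if parts and parts[0].upper() == "PERFORM" and len(parts) > 1:
            for copycode in included:
                if parts[1] in copycode_subroutines.get(copycode, set()):
                    links.add(copycode)
    return links
```

A copycode defining the same subroutine but not INCLUDEd yields no link, which is exactly the improvement described above.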
We continuously collaborate with security experts to audit the service features and infrastructure, scrutinizing every API and configuration. For this new release of AWS Blu Insights, the team put the focus on quality and security mixing bug bash sessions and penetration tests (PenTest). In parallel, we delivered a set of improvements mainly on dependencies analysis and graph manipulation.
We added Manage types to the Workspace to allow users to classify UNKNOWN files that are not recognized by the Classification. Read this article for more details.
We added the support of new statements both for zOS and AS400 languages.
CCDEF files are Compilation Control Definition files. In those files, SINC statements establish a link to a program source file. For the following sample, the dependencies analysis establishes a link between CCDEF_FILE and COB_PROGRAM.
SINC COB_PROGRAM SAMSRCE
In CL, it’s possible to define temporary objects with the OVRDBF, OVRPRTF or CRTPF commands. Those objects behave like aliases for external files, and other programs can use them. Now the dependencies analysis supports such objects, which avoids several irrelevant missing nodes in the dependencies graph.
In the example below, we define the alias MY_ALIAS for a DDS file named MYFILE. Then the alias is used in the RPG file. Here, a link will be created between MY_CL_FILE and MYFILE, but also between MY_RPG_FILE and MYFILE.
MY_CL_FILE.cl
OVRDBF FILE(MY_ALIAS) TOFILE(MYFILE)
CALL PGM(MY_RPG_FILE)
DLTOVR FILE(MY_ALIAS)
MY_RPG_FILE.rpgle
FMY_ALIAS IF E K DISK
The CRTPF statement has the same behavior.
MY_CL_FILE.cl
CRTPF FILE(MYFILE) SRCFILE(MY_ALIAS)
CALL PGM(MY_RPG_FILE)
DLTOVR FILE(MY_ALIAS)
It's also now possible to define those temporary objects in Cobol files thanks to the STRING and CALL ... USING statements.
MY_COBOL_FILE.cbl
STRING
"OVRDBF FILE(MY_ALIAS) TOFILE(MYFILE)"
DELIMITED BY SIZE
INTO CL-COMMAND
END-STRING.
CALL "COB_PROG" USING CL-COMMAND
CLENGTH.
Menu-driven applications use MNUCMD files to link a menu selection in the MNUDDS file to an action, such as calling a program. The dependency analysis can now handle this type of file and detect its dependencies. It handles the following statements. In the following examples, a dependency will be created to a file named FILE_ID:
0022 CALL PGM(FILE_ID) PARM('1' ' ')
In this example, it will also lead to a dependency to a file named FILE2_ID
.
0010 CHGDTA DFUPGM(QALIB/FILE2_ID) FILE(QADATA/FILE_ID)
0011 CLRPFM EBFILEMD
0024 GO MENU(FILE_ID)
0007 RUNQRY QRY(QALIB/FILE_ID)
0004 SBMJOB CMD(CALL PGM(CIS400/FILE_ID))
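Matching those actions can be sketched with a handful of regular expressions. This is illustrative only: the patterns cover just the examples above, and real MNUCMD statements carry more parameters and variants.

```python
import re

# Illustrative patterns for the MNUCMD actions listed above; each captures
# the referenced object name, optionally prefixed by a library name.
MNUCMD_PATTERNS = [
    re.compile(r"CALL PGM\((?:\w+/)?(\w+)\)"),
    re.compile(r"DFUPGM\((?:\w+/)?(\w+)\)"),
    re.compile(r"FILE\((?:\w+/)?(\w+)\)"),
    re.compile(r"GO MENU\((\w+)\)"),
    re.compile(r"QRY\((?:\w+/)?(\w+)\)"),
]

def mnucmd_dependencies(line):
    """Return the set of object names referenced by one MNUCMD line."""
    return {m.group(1) for p in MNUCMD_PATTERNS for m in p.finditer(line)}
```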
We added scaling policies to our ECS configuration to adjust the number of running tasks dynamically, which led to optimizing resource utilization and cost efficiency. To ensure our end users do not experience any performance degradation, we conducted performance tests to carefully select the optimal scaling policy and the appropriate scaling metrics. Additionally, we implemented task scale-in protection to guarantee zero downtime.
JCL DD statements are now insensitive to parameter order, like the EXEC statement in the previous release.
The engine has been improved for JCLs defining a lot of SQL queries and JCLs using plenty of variable replacements.
In the previous release of the dependencies analysis, a Cobol program using a transaction defined in a CSD had a link to both nodes, CSD and transaction. We removed the confusing link from Cobol to CSD in this release. As a result, in a dependencies graph, applying "show all children" on a Cobol file will not bring all CSD children. Filtering the CSD and LISTCAT files to ease work packages creation is no longer necessary 😊. See the improvement with the following CSD and Cobol program.
CSD_FILE.csd
DEFINE TRANSACTION(CCLI) GROUP(CARDDEMO)
COBOL_FILE.cob
EXEC CICS START TRANSID("CCLI") ... END-EXEC
Previously, the dependency link was formed from CSD_FILE.csd to the CCLI (Transaction) object and from COBOL_FILE.cob to CSD_FILE.csd. After the improvement, the dependency link is formed from CSD_FILE.csd to the CCLI object and from the CCLI object to COBOL_FILE.cob.
SQL had the same issue as CSD. Before, the link was formed between the file and the SQL object. With the improvement, the dependencies analysis establishes the relationship between the two object nodes. In detail, for CREATE ALIAS, a dependency link was formed from the declaring file to the alias object and also from the declaring file to the table/view creation file. Now, a dependency link is formed from the alias object to the table object directly.
For the SQL statement below, before the improvement, the dependency link was established from the CREATE_ALIAS.sql file to the ALIAS_1 object and also from CREATE_ALIAS.sql to CREATE_TABLE.sql. After the improvement, the dependency link is formed from CREATE_ALIAS.sql to ALIAS_1 and also from the alias object (ALIAS_1) to the table object (TABLE_1).
CREATE_ALIAS.sql
CREATE ALIAS ALIAS_1 FOR TABLE_1
CREATE_TABLE.sql
CREATE TABLE TABLE_1;
Object-to-object mapping was also improved for SQL statements like MODULE, INDEX, TABLESPACE, TABLE, CONSTRAINTS and VIEW. Overall, the resulting graph is easier to navigate and lighter.
We now differentiate between dependencies analyses launched and those imported from a JSON in the activity title.
&LIB/ was a false missing node in the following example. It's fixed in this version.
001800150528CREATE INDEX &LIB/
001900150528 IMPAR02BL1
The FROM term of SELECT SQL queries spanning several lines has been fixed.
# is now considered valid.
Cards can be linked.
CLIST (script language to execute TSO/E commands) are now supported by the classification.
We added the support of new statements both for zOS and AS400 languages.
CLIST files can refer to other CLIST files using a statement like NAME1.NAME2.CLIST(MY_PROG), where NAME1 and NAME2 are any valid names.
The INCLUDE statement (to build relationships between CTL and Programs) is now supported based on these two formats:
INCLUDE PROGRAM1
INCLUDE LIBRARY(PROGRAM1)
Message Format Service (MFS) is a part of the IBM Information Management System (IMS). This service receives messages from IMS, formats them and sends them back to IMS. The dependencies analysis can now identify the message identifier in MFS files and detect its usage in Cobol files. This creates a link between the Cobol file and the MFS file that defines the message, CEM01O in the following example.
CEM01O message declaration in an MFS file:
CEM01O MSG TYPE=OUTPUT,SOR=(CEM01,IGNORE),NXT=CEM01I
Use of the CEM01O message in a Cobol file:
CALL 'CBLTDLI' USING
IM-PARM-COUNT
IM-CALL-FUNC IO-PCB
TP-SEGMENT
'CEM01O'.
Many JCL statements can address CLIST files by prefixing their name with %. For instance, %MY_CLIST refers to a CLIST file named MY_CLIST in the project.
EXEC PGM=IKJEFT01 PARM='%MY_CLIST'
In JCL, there is a command named ISPSTART that is used to reference a program. The dependencies analysis now handles this statement. For example:
ISPSTART CMD(MY_PROG)
In JCL, we can execute a file, like MY_PROG, with statements like NAME1.NAME2.EXEC(MY_PROG). The NAME1 and NAME2 parts are any valid names. Statements like NAME1.EXEC(MY_PROG) are also valid. The dependencies analysis now handles such statements.
The COPY statement is now supported to create links between PL1 programs and Copybooks. The statement below establishes a relationship with the Copybook CPY1 from the PL1 file.
COPY CPY1
Before, the dependency analysis handled the INCLUDE statement as the keyword INCLUDE followed by a Copybook identifier. Now, it also handles a library name, either SYSLIB or SFSTDS, between the INCLUDE keyword and the identifier. Example of an INCLUDE statement with a library name:
% INCLUDE SYSLIB (CPY1 )
Free-form code is an evolution of RPG where statements can start at any column. RPG files starting with **FREE are full free-form programs. Now, the dependencies analysis handles those files. The RPG dependencies analysis was already detecting free-form sections between /FREE and /END-FREE instructions, but it only focused on SQL requests in these sections.
/Free
Exec SQL Update USER
Set LAST-NAME = "DOE"
Where NAME = "JANE";
/End-Free
RPG statements /COPY, /INCLUDE and EXEC SQL are handled in free-form code like in a fixed-form RPG program, i.e. classic RPG or RPGLE code. The dependencies analysis handles declaration statements using an external data definition file, printer or program. Those statements are the data structure declaration DCL-DS, the file declaration DCL-F and the prototype declaration DCL-PR.
dcl-ds local-alias extname('dds-file')
dcl-f printer-file printer
dcl-pr prototype-name EXTNAME('external-program')
According to the service logs, none of the detected vulnerabilities have been exploited prior to the Bug Bounty Program.
In JCL, statement parameters are not ordered. Those two following examples are equivalent.
EXEC PROC=procedure_file ... NAME=program, PSB=psb_name
EXEC PROC=procedure_file ... PSB=psb_name, NAME=program
The JCL analysis has been improved to take that into account for EXEC statements.
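Order-insensitive handling can be sketched by parsing parameters into a map before comparing them. This is illustrative only; real JCL parsing also deals with quoting, positional parameters and continuation lines.

```python
# Sketch: parse JCL statement parameters into a dict so lookups and
# comparisons no longer depend on parameter order (no quoting rules here).
def parse_jcl_params(param_text):
    params = {}
    for chunk in param_text.split(","):
        key, _, value = chunk.strip().partition("=")
        if value:
            params[key] = value
    return params

# The two equivalent orderings from the example above.
a = parse_jcl_params("PROC=procedure_file, NAME=program, PSB=psb_name")
b = parse_jcl_params("PROC=procedure_file, PSB=psb_name, NAME=program")
```

Once the parameters live in a dict, both orderings produce the same result.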
Some Cobol keywords in lower case, like the following example, generated false positive values for dynamic calls.
move '27' to W01returnCode
False positive Missing Program nodes are now prevented for SQL statements with the EXEC keyword.
Improved the interpretation of the PL1 procedure (PROC) definition to avoid false positive Missing Program nodes.
In RPG, we can declare a prototype linked to an external program. Until now, the dependencies analysis only considered the usage of the declared prototypes. In some cases, if the prototype wasn't used, the dependencies analysis wouldn't create a link to the referenced program. The dependencies analysis now directly creates a dependency to the external program at the prototype declaration. In the example below, we declare a prototype named myprototype that can be used later in the RPG file without creating a new dependency. With the prototype declaration, we create a dependency to a program named MY_PROG in the project.
D myprototype PR EXTPGM('MY_PROG')
On a Mainframe, SQL files can have margins, like in the following example. Now, the dependencies analysis ignores the margin while processing them.
000000000 CREATE TABLE
000000000 TABLE1;
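Margin stripping can be sketched in one line of preprocessing. The assumption here, taken from the example above, is that the margin is a leading run of digits followed by a space; the real analysis may use fixed column positions instead.

```python
import re

# Sketch: drop a leading numeric margin (sequence-number columns) before
# feeding mainframe SQL to the parser.
def strip_sql_margin(line):
    return re.sub(r"^\d+\s?", "", line)

# The two margined lines from the example above.
cleaned = " ".join(
    strip_sql_margin(l) for l in ["000000000 CREATE TABLE", "000000000 TABLE1;"]
)
```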
“I very frequently get the question: ‘What’s going to change in the next 10 years?’ And that is a very interesting question; it’s a very common one. I almost never get the question: ‘What’s not going to change in the next 10 years?’ And I submit to you that second question is actually the more important of the two — because you can build a business strategy around the things that are stable in time […] When you have something that you know is true, even over the long term, you can afford to put a lot of energy into it.”
― Jeff Bezos
This advice from Jeff is one pillar of the design of AWS Blu Insights. In the mainframe modernization business, there is a set of obvious things that will never change, both for users (e.g. sales & delivery teams) and customers (end-users contracting the modernization). For example: increasing prospects' and customers' trust, getting back to prospects within hours after receiving the source code, reducing the level of expertise required from non-technical or non-expert stakeholders (i.e. usually decision makers on the customer side), providing a sharp follow-up on project execution, sharing and collaborating transparently, building links among all the tools, identifying potential risks upfront, etc. We remain focused on our mission by delivering new features to address things that will never change, while using new techniques and technologies to address them. Working backwards as usual! This new release is full of novelties: automatic graph decomposition, potential extra links, graph operations, BI Builder, duplicated program ids, Workspace in TC, Codebase from S3, etc. See details below.
For users who have been using AWS Blu Insights for quite some time, getting a ready-for-assessment Codebase project can become a mechanical and time-consuming task with no added value. Most projects require the same steps to get the Transformation Center project up and running with no extra fine-tuning as AWS Blu Insights' components get more mature. And this is exactly why we built BI Builder: to jumpstart AWS Blu Insights projects and give you back those precious minutes (sometimes hours) to focus on the things that matter more. The Builder automates the whole process by setting up your Codebase project, launching the required analyses, creating the Transformation Center project and launching the first Velocity run to get the first outputs. Blu Insights will notify you when each step is done.
Automatic constraints-based subgraph extraction: Split your large graphs into multiple subgraphs by simply tweaking some configuration knobs: Pick the number of subgraphs to be generated, if needed, add other entry points that will be used to traverse the graph, specify maximum constraints (e.g. number of files, number of effective lines of code) that will be enforced (i.e. not exceeded) on all the generated subgraphs and if needed, specify a file type distribution to target. Subgraphs with similar types of distribution will be favored during the generation. Refer to the dedicated blog article to delve deeper into the vision and motivation behind this change, and to the accompanying documentation for a comprehensive and practical guide on leveraging this feature.
Supercharge your graphs with custom artefacts: Users can customize the presented artefacts of the graph by editing their types and colors. They can also hide and show nodes in their subgraphs. For expert users, they can download the results as a JSON file, rework it and upload it again. Usually the goal is to add or remove some vertices and/or edges to get the graph that most fits the customer’s needs or the project methodology. Now, it is possible to add, customize, and remove edges and vertices within a few clicks.
Guided enrichment using cross references: Enrich your graphs with potential undetected dependencies. When you launch a dependency analysis, a specific engine supporting hundreds of statements runs to detect relationships. Sometimes it may miss a few links (e.g. new statement, new language, etc.), which may lead to an incomplete result. This new feature will help you close these gaps based on filtered propositions using cross references between files.
We added the support of new languages and statements based on actual use cases shared by the delivery teams.
The dependency analysis can now handle the Easytrieve (EZT) language, in particular when an Easytrieve macro (MACRO) named macro-file is used in an Easytrieve source file, like in the following example:
%<macro-file>
Besides other programs, the dependency analysis supports control card (CTL) file linkage for the system utility scenario. In the following example, the JCL establishes a dependency link with the CTL file named CTL_FILE and the PSB file named PSB_FILE.
EXEC PGM=DFSRRC00, PARM=(DLI, CTL_FILE, PSB_FILE)
The dependency analysis can now handle the ADDRQ, DEMAND and DEMANDH statements, and creates a dependency link between the JCL files and the job in the parameter of these statements. In the following example, the dependencies are FIRST_FILE and SECOND_FILE in the first statement and FIRST_FILE in the second one.
ADDRQ,JOB=FIRST_FILE,DEPJOB=SECOND_FILE
DEMAND,JOB=FIRST_FILE
In ASM and Cobol files, it is possible to define an ENTRY statement. This statement is an entry point to a program. The dependencies analysis can now detect and link such calls.
COPY statements among MFS files are now detected.
SQLC files are similar to SQL files but with extra margins. The dependency analysis can now handle this type of file and detect its dependencies.
New billing mechanism: The Transformation Center service pricing has evolved to address customers' needs and expectations. The key changes are:
More details can be found in the documentation and FAQ.
Weather report: This report allows you to see at a glance the compliance of the inputs and estimate the effort required to modernize the related codebase. The option Only Weather Report in the Velocity pop-up allows you to get as many weather reports as you would like at no cost.
Workspace: You can now visualize transformation outputs directly in Blu Insights and benefit from all its IDE-like features.
Create codebase projects from S3 bucket: This feature allows users to create codebase projects from an S3 bucket by granting Blu Insights temporary access to their source code through an S3 presigned URL. The project creation is handled in the following steps: First, the user creates a presigned URL for the source code file hosted in an S3 bucket. Then, they pass the URL to Blu Insights. The latter uploads the source code based on the provided URL to its infrastructure and then creates the project.
Duplicated Cobol program identifiers detection: Cobol files with different names may share the same PROGRAM-ID. Getting the list of those files during the assessment helps to understand the dependencies among all programs. Blu Insights offers a new feature to automatically extract this information. You can find this tool in Assets > Statistics in the … menu. It groups all files that share the same Program ID so that the user can compare and replace the files that do not have a unique Program ID.
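As a minimal illustration (the file and program names are hypothetical), two differently named files may declare the same identifier:

```cobol
      *> File PAYROLL1.cbl
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PAYROLL.

      *> File PAYROLL2.cbl
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PAYROLL.
```

The tool would group both files under PAYROLL, letting the user compare them and resolve the duplicate.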
Automatic Impacts analysis: This new module lets you see at a glance all workpackages, test scenarios, labels, statuses, and team members that are impacted by a code refresh. The Impacts module goes through all the artifacts (workpackages, test scenarios, statuses, labels, and team members) of the reference project and checks whether they are linked to a file that is modified or deleted in the refreshed project. Each artifact category has its own tab, where the impacted artifacts are listed along with the files impacting them.
Board Notes: We added a new notes section to let you keep your project notes in one private and unique place.
In TN5250, the connection phase is separated from the terminal screen. In case of a connection failure, errors such as server/socket issues or an unavailable workstationId are displayed on the form page.
We updated the estimates service to align with the Transformation Center pricing model and the new SOW template.
One of our favorite Leadership Principles is “Insist on the Highest Standards”. We apply it in our work methodology, testing strategy, development workflow, deployment strategy, etc. It is actionable and its outcomes are easy to identify (i.e., fewer bugs and more satisfied customers). For example, we maintain over 2,000 test cases covering the most used UI features, dependency statements, classification, security, performance, etc. This new release was fully dedicated to quality: we updated over 200 test cases and mainly improved existing features (e.g., UI fixes, dependency results, and Capture & Replay fine-tuning).
We are receiving more and more feedback from all of you. Please keep it coming; we are happy to help and improve AWS Blu Insights working backwards from your needs.
All the details about this new release are below and the documentation has been updated accordingly. Enjoy!
We improved the performance of file deletion in Codebase projects (going from 60+ minutes to 30 seconds for 2,000 files)!
Legacy login and registration pages (using MFA) are now deprecated, and users are redirected to the Single Sign-On pages.
You can now add a description for your profile. It will help you identify at a glance which account you are working on. This description is displayed under your name at the top right of the application.
This new version of AWS Blu Insights embeds, as usual, new features and improvements, but it also brings major changes such as Single Sign-On (SSO) and the Transformation Center billing.
Single Sign-On
We are excited to share that AWS Blu Insights has added Single Sign-On (SSO) capabilities to improve security and simplify navigation from the parent service (Mainframe Modernization). AWS Blu Insights will be accessible directly from the AWS Console. Registration on https://bluinsights.aws/ will no longer be required (and will later be deprecated).
We frequently heard from customers that they wanted to use their AWS accounts when working on AWS Blu Age refactoring projects. With this new feature, customers can now safely manage their accounts, leveraging AWS authentication mechanisms. The migration of the legacy accounts is ongoing.
Billing of the transformation
The usage of the Transformation Center is now charged following the pricing model described here. We describe all the details on how it works here.
The design, implementation, and deployment of these two features required thorough security validation (through a PenTest) in collaboration with AppSec teams. The outcome is a more secure AWS Blu Insights for the benefit of our customers.
More resources (Your inputs are welcome)
The team added over 100 new entries to the FAQ dealing with the different features, security, etc. We also added a new Resources section which will contain more details about major updates, useful links, usage examples, etc. We invite all of you to help us enrich this section. Please share your contributions and lessons learned.
The team is excited to see how hundreds of users are adopting the solution to modernize legacy applications and move them to the cloud. We have an ambitious plan for this year, and the first outcomes are already here in this release, full of improvements and new features. You may also have noticed that AWS Blu Insights moved to bluinsights.aws!
Codebase
Capture & Replay
Dashboards
Transformation Center
Most of the improvements speed up Run launches and results display.
Codebase
Dependencies
Classification
Secured Spaces
Capture & Replay
System Utilities
Misc
Website & Documentation
Codebase
Transformation Center
Capture & Replay
Misc
Transformation Center
Capture & Replay
Dependencies
Misc
Capture & Replay
Transformation Center
Dependencies
Misc
Capture & Replay
Codebase
Transformation Center
Dependencies
Misc
Email update with invalid email breaks the Profile.
This is one of the major versions of AWS Blu Insights! Most of the work is behind the scenes, but it brings more stability, better performance, an improved user experience, and much more. Enjoy!
Dependencies
Performances
This new version of AWS Blu Insights brings multiple performance improvements including:
Codebase
Terminals
Transformation Center
Misc
Codebase
TN5250
Transformation Center
Misc
Codebase
Transformation Center
Improvements
Transformation Center
Capture & Replay
Codebase
Bug fixes
Transformation Center
Capture & Replay (TN5250)
Dependencies
Documentation: The website (and documentation) has been updated with all the novelties.
It’s summer, and our team is taking a few days to relax, get some rest, and come back full of energy to deliver new features. During this batch, we mainly focused on improvements, documentation updates, and a lot of non-visible work (test automation, technical updates, security checks, etc.). Here are the main achievements.
Transformation Center
Capture & Replay
Codebase
Improvements
Transformation Center
Dependencies
Transformation Center
Estimates
Secured Space
Codebase
To-Dos
Various
Documentation: The website (and documentation) has been updated with all the novelties.