
Harvest R7 – R12: Performing an Upgrade Process

A checklist of activities to be aware of when performing the upgrade process

This article focuses on the upgrade from AllFusion Harvest Change Manager R7.1 to CA Software Change Manager (SCM) R12.0.2. I recently went through this process and felt it would be beneficial to share the experience with other users in the field. We ran into some surprises along the way, and I wanted other users to be able to treat this article as a checklist of activities to be aware of when performing the upgrade.

This process can be fairly straightforward if the upgrade concerns the repository data alone. However, there is considerable overhead if you are using Forms, Attachments to Forms, User Defined Processes (UDPs), email notifications, password encryption, Enforce Package Bind flags, Verify Package Dependency flags, or canned or customized reports. The project I upgraded used many of these features, which made the upgrade process correspondingly more labor-intensive. The team behind this project used Harvest aggressively and took full advantage of the power and robustness of the tool, which in turn enriched the project information available to the team in a central repository.

Presently, I am the administrator of CA SCM (AllFusion Harvest) R12, a process-based Software Configuration Management (SCM) tool for managing application source data assets. I manage 198 applications housed in Harvest and support 163 users of the product. The development tools in our environment are PowerBuilder PB V8 and PB V11; Visual Studio 2003, 2005 and 2008; Visual Basic V6.0; IBM RAD V7.5.5; and Eclipse Europa.

As the Software Configuration Manager, I administer the source code management tool. This includes creating Harvest environments; developing life cycles, environment phases and processes; managing users, user groups and levels of access to Harvest environments; loading repositories for archival purposes; documenting software methodologies; maintaining build machines; providing best practices; and training all users on proper source code management with the development tools in our environment.

Every Software Configuration Management tool is different in terms of functionality and navigation; however, they all share common threads of functionality. Common activities include checking out, checking in, adding new files, deleting existing files, obsoleting files and loading an initial baseline of source code. How these tasks are achieved of course differs from tool to tool, but most SCM tools perform these basic activities. I prefer SCM tools backed by a relational database for security, disaster recovery, retrieval and storage capability.

One of the first big differences I noticed up front is that the form is now stored as a table in the Oracle database in SCM 12, whereas in Harvest 7.1 it was stored on a shared drive or some other location users could point to from within the Harvest Workbench. Figures 1 and 2 illustrate how you set the path to the form's location so users can view it from within Harvest 7.1. In SCM 12, formgen.exe is run against the .hfd form definition to generate .sql, .htm and .xml files; the hformsync command is then run against the .hfd, and the .xml content is uploaded into a table in the SCM 12 database.

In our case the database is Oracle 11g. This is a nice change from AllFusion Harvest Change Manager R7.1: it safeguards the form from changes made on a shared drive, and storing the form in a database table makes retrieval and storage of its data more efficient. There is no longer any need for a shared drive location populated with form files for users to point to. It also prevents ad hoc edits to the .htm or .hfd files that circumvent the process by which the data is synched to the database. In the past, with the form files located where anyone could reach them, there was a temptation to change these files on the fly to reflect certain pieces of data without that data reaching all of the form files, which could cause sync issues later on with the form and any upgrade of the product. The new process takes away that temptation and keeps the data in the database.

For this upgrade, a new server was purchased to house a new installation of SCM R12. We installed the SCM R12 server software, the SCM R12 client and the Oracle 11g client on this server, pointing to a database on a UNIX platform. Once we created our ODBC connection and the new schema, we used a database export from our AllFusion Harvest Change Manager 7.1 server to import the data into the new SCM R12 database.

Before we ran hdbsetup.exe, I ran the hsysreport and hdbanalyze commands, which report on the database and confirm that no errors exist before the upgrade. You want to make sure you are importing error-free data into an error-free database (garbage in, garbage out). Once this was complete, we ran CA's hdbsetup.exe to upgrade the instance and data from Harvest 7.1 to SCM R12 using the options Upgrade SCM Repository (UR), Load Projects (LP) and Load Forms (LF). These upgrade all repository, project and form data as of the date of whatever export you acquire.
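The pre-check and upgrade sequence above can be sketched as a small driver script. This is a minimal sketch, not CA's tooling: the site-specific switches each utility needs (broker, credentials, database connection) vary by release and are deliberately left out, so the dry-run mode simply prints the plan. Consult the Command Line Reference Guide for the exact arguments.

```python
import subprocess

def upgrade_plan(dry_run=True):
    """Order of operations for the 7.1 -> R12 data upgrade.

    The utility names come from the upgrade steps described above; the
    arguments each one needs are release- and site-specific, so they
    are omitted here (see the Command Line Reference Guide).
    """
    steps = [
        ("hsysreport", "report on the state of the source database"),
        ("hdbanalyze", "verify the database is error-free before the upgrade"),
        ("hdbsetup",   "run UR (Upgrade SCM Repository), LP (Load Projects), LF (Load Forms)"),
    ]
    for cmd, why in steps:
        if dry_run:
            print(f"{cmd}: {why}")
        else:
            # Add the site-specific arguments before running for real.
            subprocess.run([cmd], check=True)
    return [cmd for cmd, _ in steps]
```

Running the two analysis tools before hdbsetup is what enforces the "error-free data into an error-free database" rule from the text.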

Once these actions were complete, I ran formgen.exe against the latest .hfd file, which created the latest .sql, .htm and .xml files. I then created a directory at the root of the data drive on the server simply called "Forms" and copied all four form files there. I did this because I had a difficult time navigating to the SCM home directory; this is a Windows 2008 server, and there were some unique issues related to this operating system that had to be overcome. Once the forms (Test Form.hfd, Test Form.htm, Test Form.sql and Test Form.xml) were in the new directory, I ran CA's hformsync command from that directory. Below is the command I used to sync the form and its data to the database:

Example:  hformsync -b "server name" -usr "xxxxxx" -pw "xxxxxx" -d "e:\forms" -hfd -f "name of form.hfd"

When this runs, the following output is generated, confirming whether hformsync completed successfully or had any failures:

Example:  I00060040: New connection with Broker "xxxxxxxxxx"  established.
Problem Report form : processed successfully.
Number of Files Updated in DB:1
Form synchronization has been completed successfully.
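Because we scripted several of these syncs, it helped to check the hformsync output programmatically rather than eyeball it. Below is a small sketch; the success message and the "Number of Files Updated in DB" counter are taken from the sample output above, so adjust the patterns if your release words them differently.

```python
import re

def parse_hformsync_output(output: str):
    """Return (succeeded, files_updated) from hformsync console output.

    Looks for the completion message and the 'Number of Files Updated
    in DB:N' counter shown in the sample output above.
    """
    succeeded = "Form synchronization has been completed successfully." in output
    m = re.search(r"Number of Files Updated in DB:\s*(\d+)", output)
    updated = int(m.group(1)) if m else 0
    return succeeded, updated

sample = """I00060040: New connection with Broker "xxxxxxxxxx"  established.
Problem Report form : processed successfully.
Number of Files Updated in DB:1
Form synchronization has been completed successfully."""

ok, n = parse_hformsync_output(sample)
print(ok, n)  # True 1
```

A wrapper like this lets a batch of form syncs fail loudly instead of burying one failed form in the console scroll.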

The form in SCM 12 has new navigation that differs from Harvest 7.1. For example, attachments are now added directly under the form in the tree view. In Harvest 7.1 a small paper clip sat at the bottom of the form and you attached files that way; now you right-click the form to add a file to it. There is also a printable details report of the form that is very useful. In addition, you can now open multiple forms at once from the Find Form view by clicking Edit Form; once you close the Find Form view, the forms are all available, and clicking the (X) button at the top of each form closes it after viewing. Figure 4 illustrates the paper clip at the bottom of the Harvest 7.1 form, and Figure 5 illustrates the right-click commands now available on the form in SCM R12.

Figure 6 illustrates the form search. Once the forms have been located, you can use the Ctrl key to select the form(s) you want to view or edit, right-click, and select Edit Form; each form then opens in the right-hand pane of the SCM Workbench. You will have to close the Find Form search screen to see the forms behind it. Figure 7 illustrates, at the top of the menu bar, the forms that are open for viewing. When you have finished reviewing one form, click the (X) to close it and begin viewing the next.

The database export takes approximately one hour to complete. We shut down the Harvest brokers while it runs to make sure no one is accessing the database during the export, and we typically do this in the evening so users are not impacted during the day. We have found that running an export during the day slows our network considerably and impacts users' ability to perform Harvest SCM activities. When a full export is available, it is then "sneaker-netted" to the location where we want to import it into the SCM 12 database instance.

Once the commands have all been run and the data has been successfully updated, it is time to test that the data that came over in the upgrade is accurate and that all processes work as expected:

  1. All users can log into the SCM R12 Workbench.
  2. Repository data is accessible and accurate.
  3. Form data is available and accurate.
  4. Form attachments are accurate and can be viewed.
  5. User Defined Processes (UDPs) are available and accurate.
  6. Package Binding and Dependency flags are in place and functional.
  7. Promotions and demotions work in all Harvest states.
  8. Test checkouts and check-ins confirm data can still be processed.

Once the verification testing is complete and you are confident in the accuracy of all other processes and data assets, it is time to update your build script, perform a build and produce a solid executable. By solid executable I mean a verifiable build that reflects production in the field. The source data assets are accessed via the Harvest hco command, and data is acquired from the new Harvest server repositories.

When we began updating and running our Ant script, we were unaware that encryption was being used during the build process. We discovered this while running the Ant script: when we attempted to acquire browse (read-only) source data assets from the new SCM R12 Harvest server, environment, state and repository, it produced the following error message.

The error is produced when the -eh option is used with the hco command in the Ant build script.

HCO command we use:  hco -b "server name" -usr "xxxxxxxx" -pw "xxxxxxxx" -vp \Test_Code_Project -en Test_Code_Project -st "ACCEPTANCE TEST" -pn "CHECK OUT FOR BROWSE/SYNCHRONIZE" -cp c:\Test_Project\ -br -r -op pc -s "*.*" -o output.log

ERROR: Please encrypt the password file..\lib\xxxxxx.txt with new svrenc utility

Refer to page 161 of the CA Software Change Manager Command Line Reference Guide; note that the error above refers to the NEW svrenc utility. When this error is received, you need to run the command below to create a new encrypted user/password file. The name of the new encrypted file needs to be xxxxxxxx.txt.

svrenc Command: Encrypt User and Password Credentials to a File

The svrenc command is a user name and password encryption utility that stores encrypted credentials in a file, which can then be used by:

  1. CA SCM Server to obtain the database user and password
  2. CA SCM Remote Agent (LDAP support)
  3. CA SCM command line utilities to obtain the CA SCM or remote computer user and password

This command has the following format:

svrenc {-s | -f filename} [-usr  username] [-pw password] [-dir  directory_name] [-o filename | -oa  filename] [-arg] [-wts] [-h]

-s   (Required; -s and -f are mutually exclusive and one is required.)  Specifies that the encrypted credentials are saved in a hidden file named hsvr.dfo in <CA_SCM_HOME>, which is then used by the CA SCM Server processes when connecting to the database server.

-f filename (Required; -s and -f are mutually exclusive and one is required.)  Specifies that an encryption file be created with the file name you provide.

If -f filename is specified, the encrypted credentials are saved in a hidden file with that name, which can then be used by Remote Agent LDAP support or the command-line utilities.

Once we ran the following command, the newly encrypted file was saved on our build machine and the Ant script built successfully as expected.

Example:  svrenc -f password.txt -usr xxxxxxx -pw xxxxxxxx -dir C:\Build\scripts

This encrypted file did not exist on our new build machine and had to be re-created using the command above. Running it created the new encrypted file, which contained the encrypted user name and password needed to run the Ant build script.

When we perform backups or database exports on the Harvest server, we shut down all the services, including the Harvest brokers, the Oracle database and Apache Tomcat. We developed automated scripts that run on a schedule every evening to shut down the services, and when the backups and database export are complete, the scripts bring all the services back up. We updated these automated scripts for the new Harvest server and tested them to ensure the backups and exports run successfully.
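The nightly automation is essentially an ordered sequence with a guaranteed restart. The control flow can be sketched as below; the actual shutdown, backup, export and startup commands for the brokers, Oracle and Tomcat are our site-specific scripts, so they appear here as hypothetical callables passed in by the caller.

```python
def nightly_maintenance(shutdown, backup, export, startup):
    """Run the evening cycle: stop services, back up, export, restart.

    The four arguments are callables standing in for site-specific
    scripts (broker/Oracle/Tomcat stop, file backup, database export,
    service start). The try/finally guarantees services come back up
    even if the backup or export fails mid-run.
    """
    log = []
    shutdown()
    log.append("shutdown")
    try:
        backup()
        log.append("backup")
        export()
        log.append("export")
    finally:
        startup()
        log.append("startup")
    return log
```

The try/finally is the important design choice: a failed export at 2 a.m. should still leave the brokers, database and Tomcat running for users in the morning.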

The export we used to import the projects from the Harvest 7.1 server contains Harvest environments that are not required on the new SCM R12 server. These projects take up resources (RAM and hard drive space) that are unnecessary on this server. CA provided us with the hsysdelete command, which allows deletion of project/repository data from the Harvest SCM R12 database. Below is the command, as provided by CA, used to delete the unwanted Harvest environments:

hsysdelete -n "xxxx" -u "xxxxxx" -p XXXXXXXXX "Repository Name"
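We had several unwanted environments, so it was convenient to drive the hsysdelete command above from a list. Below is a minimal sketch that only assembles the argument lists using the flag layout shown above (-n broker, -u user, -p password, then the repository name); the broker and credentials are placeholders, and nothing is executed, since deletion is irreversible and you will want to review the plan first.

```python
def hsysdelete_commands(broker, user, password, repositories):
    """Build one hsysdelete invocation per unwanted repository,
    following the flag layout of the command shown above."""
    return [
        ["hsysdelete", "-n", broker, "-u", user, "-p", password, repo]
        for repo in repositories
    ]

# Print the plan for review before running anything.
for cmd in hsysdelete_commands("broker1", "admin", "secret", ["OldEnvA", "OldEnvB"]):
    print(" ".join(cmd))
```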

One of the last tasks of this upgrade was to install the Business Objects application to make available the 44 canned reports that come with SCM R12. Managers and users will find these reports very useful for reporting on SCM activities by user or project.

Our Experience
We develop and maintain more than 198 applications at New Hampshire's Department of Information Technology. The applications are used extensively in our welfare and health services delivery agencies. Example applications are for child-care licensing and managing adult and elderly care. Throughout the state the applications are used by hundreds of users.

My synopsis and review of the SCM R12 upgrade process goes as follows:

As I stated earlier, this process is very straightforward if all that is involved is updating repository and project data. However, there is a lot of overhead if you are using Forms, Attachments to Forms, User Defined Processes (UDPs), email notifications, password encryption, Enforce Package Bind flags, Verify Package Dependency flags, or canned or customized reports. The project I upgraded used many of these features, which made the upgrade more labor-intensive. The team used Harvest aggressively and took advantage of the power and robustness of the tool, which in turn enhanced the project information available to the team in a central repository - though as a result there was much more to look at and attend to in making this upgrade go off successfully.
The more moving parts, the more opportunities for issues to arise. The good news is that CA has a great technical team that is very helpful; when issues arise, they have the resources to tackle them and help you succeed with your implementation.

Please feel free to contact me should you have any questions regarding the product (CA SCM AllFusion Harvest) and its use in our environment with various development tools.

More Stories By Al Soucy

Al Soucy is software configuration manager at the State of New Hampshire's Department of Information Technology (DoIT). In that role Al manages software configuration for dozens of PowerBuilder applications as well as applications written in Java, .NET, and COBOL (yes, COBOL). Al plays bass guitar, acoustic guitar, electric rhythm/lead guitar, drums, mandolin, keyboard; he sings lead and back up vocals and he has released 8 CDs.
