
Harvest R7 – R12: Performing an Upgrade Process

A checklist of activities to be aware of when performing the upgrade process

This article focuses on the upgrade process from AllFusion Harvest Change Manager R7.1 to CA Software Change Manager (SCM) R12.0.2. I recently went through this process and felt it would be beneficial to share the experience with other users in the field. We ran into some surprises along the way, and I want other users to be able to treat this article as a checklist of activities to be aware of when performing the upgrade.

The process can be fairly straightforward if the upgrade concerns the repository data alone. However, there is a lot of additional overhead if you are using forms, attachments to forms, User Defined Processes (UDPs), email notifications, password encryption, Enforce Package Bind flags, Verify Package Dependency flags, canned reports or customized reports. The project I upgraded used many of these features, which made the upgrade more labor-intensive. The team behind this project used Harvest aggressively and took advantage of the power and robustness of the tool, which in turn enriched the project information available to the team in a central repository.

Presently, I am the administrator of CA SCM (AllFusion Harvest) R12, a process-based software configuration management (SCM) tool for managing application source data assets. I manage 198 applications housed in Harvest and support 163 users of the product. The development tools we currently use in our development environment are PowerBuilder PBV8 and PBV11; Visual Studio 2003, 2005 and 2008; Visual Basic V6.0; IBM RAD V7.5.5; and Eclipse Europa.

As the software configuration manager, I administer the source code management tool. This includes creating Harvest environments; developing life cycles, environment phases and processes; managing users, user groups and levels of access to Harvest environments; loading repositories for archival purposes; documenting software methodologies; maintaining build machines; providing best practices; and training all users on proper source code management with the development tools in our environment.

Every software configuration management tool is different in terms of functionality and navigation; however, they all share common threads of functionality. Common activities include checking out, checking in, adding new files, deleting existing files, obsoleting files and loading an initial baseline of source code. How these tasks are achieved of course differs from tool to tool, but most SCM tools perform these basic activities. I prefer SCM tools that have a relational database behind them for security, disaster recovery, retrieval and storage capability.

One of the first big differences I noticed up front is that in SCM 12 the form is now stored as a table in the Oracle database, whereas in Harvest 7.1 it was stored on a shared drive or some other location to which users set a path and pointed from within the Harvest Workbench. Figures 1 and 2 illustrate how you set the path to the form's location in Harvest 7.1 so users could point to it and view the form. In SCM 12, formgen.exe is run against the .hfd file to generate a .sql, .htm and .xml file; the hformsync command is then run against the .hfd, and the .xml content is uploaded into a table in the SCM 12 database.

In our case the database is Oracle 11g. This is a nice change from AllFusion Harvest Change Manager R7.1. Storing the form in a database table safeguards it from ad hoc changes on a shared drive and makes retrieval and storage of the form data more efficient. There is no longer a need for a shared drive location populated with form files for users to point to. It also prevents someone from editing the .htm or .hfd directly and circumventing the process that keeps the data synchronized in the database. In the past, with the form files sitting where anyone could reach them, there was a temptation to change these files on the fly to reflect certain pieces of data without that data getting into all of the form files, and you could run into sync issues later on with regard to the form and an upgrade of the product. The new process takes away that temptation and keeps the data in the database.

When we did this upgrade, a new server was purchased to house a fresh installation of SCM R12. We installed the SCM R12 server software, the SCM R12 client and the Oracle 11g client on this server, pointing to an Oracle database on a UNIX platform. Once we created our ODBC connection and the new schema, we used a database export from our AllFusion Harvest Change Manager 7.1 server to import the data into the new SCM R12 database.
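
For reference, here is a rough sketch of the kind of Oracle export and import involved; the schema name HARVEST, the connect strings and the file paths are placeholders for illustration only, and your DBA's actual procedure may use Data Pump or different options:

Example:  exp system/xxxxxx@harvest71 owner=HARVEST file=e:\exports\harvest71.dmp log=e:\exports\harvest71_exp.log
Example:  imp system/xxxxxx@scm12 fromuser=HARVEST touser=HARVEST file=e:\exports\harvest71.dmp log=e:\exports\harvest71_imp.log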

Before we ran the hdbsetup.exe command, I ran the hsysreport and hdbanalyze commands, which provide information about the database and ensure that no errors exist before hdbsetup.exe is run. You want to make sure that you are importing error-free data into an error-free database (garbage in, garbage out). Once this was complete, we ran CA's hdbsetup.exe to upgrade the instance and data from Harvest 7.1 to SCM R12 using the options Upgrade SCM Repository (UR), Load Projects (LP) and Load Forms (LF). These options upgrade all repository, project and form data as of whatever export date you used.
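
The rough order we followed looked like this; UR, LP and LF are chosen from within hdbsetup itself (at least in our run), and I am omitting the connection arguments here because they vary by installation:

Example:  hsysreport     (report on the contents of the imported instance)
Example:  hdbanalyze     (analyze the database and confirm it is error-free)
Example:  hdbsetup       (then select UR - Upgrade SCM Repository, LP - Load Projects, LF - Load Forms)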

Once these actions were complete, I ran formgen.exe against the latest .hfd file, which created the latest .sql, .htm and .xml files. I then created a directory at the root of the data drive on the server, simply called "Forms," and copied all four of the form files to this location. I had to do this because I had a difficult time navigating to the SCM home directory; this is a Windows 2008 server, and there were some unique issues related to the operating system that had to be overcome. Once the form files (Test Form.hfd, Test Form.htm, Test Form.sql and Test Form.xml) were in the new directory at the root of the data drive, I ran CA's hformsync command from that directory. Below is the command I used to sync the form and its data to the database:

Example:  hformsync -b "server name" -usr "xxxxxx" -pw "xxxxxx" -d "e:\forms" -hfd -f "name of form.hfd"

When the command runs, output like the following is generated to confirm whether hformsync completed successfully or had any failures associated with it:

Example:  I00060040: New connection with Broker "xxxxxxxxxx"  established.
Problem Report form : processed sucessfully.
Number of Files Updated in DB:1
Form synchronization has been completed successfully.

The form in SCM 12 has new navigation that differs from Harvest 7.1. For example, attachments are now added directly under the form in the tree view; in Harvest 7.1 there was a small paper clip at the bottom of the form and you attached files that way. Now you right-click the form to add a file to it. There is also a details report of the form that is very useful and can be printed. In addition, you can open multiple forms at once from the Find Form view by selecting Edit Form; close the Find Form view and the forms are all available to view, and clicking the (X) button at the top of each form closes it when you have finished with it. Figure 4 illustrates the paper clip at the bottom of the Harvest 7.1 form, and Figure 5 illustrates the right-click commands that are now available on the form in SCM R12.

Figure 6 illustrates the form search. Once the forms have been located, you can hold the Ctrl key to select the form(s) you want to view or edit, right-click and select Edit Form, and each form is opened in the right-hand pane of the SCM Workbench for viewing. You will have to close the Find Form search screen to see the forms, which sit behind it. Figure 7 shows, at the top of the menu bar, the forms that are open for viewing. When you have finished reviewing one form, click the (X) to close it and begin viewing the next.

The database export takes approximately one hour to complete. We shut down the Harvest brokers while it runs, and we typically run it in the evening so that no one is accessing the database during the export and users are not impacted during the day. We have found that running an export during the day slows our network performance considerably and impairs users' ability to perform Harvest SCM activities. When a full export is available, it is then "sneakernetted" to the location where we want to import it into the SCM 12 database instance.

Once the commands have all been run and the data has been successfully updated, it is time to test and verify that the data that came over as part of the upgrade is accurate and that all processes work as expected:

  1. All users can log into the SCM R12 Workbench.
  2. Repository data is accessible and accurate.
  3. Form data is available and accurate.
  4. Form attachments are accurate and can be viewed.
  5. User Defined Processes (UDPs) are available and accurate.
  6. Package Binding and Dependency flags are in place and functional.
  7. Promotions and demotions work in all Harvest states.
  8. Test checkouts and check-ins confirm that data can still be processed.

Once the verification testing is complete and you are confident in the accuracy of all processes and data assets, it is time to update your build script, perform a build and produce a solid executable. By solid executable I mean a verifiable build that reflects production in the field. The source data assets are accessed via the Harvest hco (check out) command and are acquired from the repositories on the new Harvest server.

When we began updating and running our Ant script, we were unaware that password encryption was being used during the build process. We discovered it when the Ant script attempted to acquire browse (read-only) source data assets from the new SCM R12 Harvest server, Harvest environment, Harvest state and Harvest repository, and the checkout failed with the error shown below.

The error is produced when the hco command in the Ant build script references an encrypted password file through the -eh option.

HCO command we use:  hco -b "server name" -usr "xxxxxxxx" -pw "xxxxxxxx" -vp \Test_Code_Project -en Test_Code_Project -st "ACCEPTANCE TEST" -pn "CHECK OUT FOR BROWSE/SYNCHRONIZE" -cp c:\Test_Project\ -br -r -op pc -s "*.*" -o output.log

ERROR: Please encrypt the password file..\lib\xxxxxx.txt with new svrenc utility

Refer to page 161 of the CA Software Change Manager Command Line Reference Guide; note that the error above refers to the new svrenc utility. When this error is received, you need to run the command below to create a new encrypted user/password file. The name of the new encrypted file needs to be xxxxxxxx.txt (the file name referenced in the build script).

svrenc Command: Encrypt User and Password Credentials to a File

The svrenc command is a user name and password encryption utility that stores encrypted credentials in a file, which can then be used by the:

  1. CA SCM Server to obtain the database user and password
  2. CA SCM Remote Agent (LDAP support)
  3. CA SCM command line utilities to obtain the CA SCM or remote computer user and password

This command has the following format:

svrenc {-s | -f filename} [-usr  username] [-pw password] [-dir  directory_name] [-o filename | -oa  filename] [-arg] [-wts] [-h]

-s   (Required; -s and -f are mutually exclusive and one is required.)  Specifies that the encrypted credentials are saved in a hidden file named hsvr.dfo in <CA_SCM_HOME>, which is then used by the CA SCM Server processes when connecting to the database server.

-f filename (Required: -s and -f are mutually exclusive and one is required.)  Specifies that an encryption file be created with the file name you provide.

If -f filename is specified, the encrypted credentials are saved in a hidden file with the name you provide, which can then be used by Remote Agent LDAP support or the command-line utilities.

Once we ran the following command, the newly encrypted file was saved on our build machine and the Ant script built successfully as expected.

Example:  svrenc -f password.txt -usr xxxxxxx -pw xxxxxxxx -dir C:\Build\scripts

This encrypted file did not exist on our new build machine and had to be re-created using the command above, which produced a new encrypted file containing the user name and password needed to run the Ant build script.
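
For reference, here is a rough sketch of how the checkout in a build script can reference the new encrypted credentials file through the -eh option instead of -usr/-pw; the file path is a placeholder, and you should confirm the exact option behavior in the Command Line Reference Guide for your release:

Example:  hco -b "server name" -eh "C:\Build\scripts\password.txt" -vp \Test_Code_Project -en Test_Code_Project -st "ACCEPTANCE TEST" -pn "CHECK OUT FOR BROWSE/SYNCHRONIZE" -cp c:\Test_Project\ -br -r -op pc -s "*.*" -o output.log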

When we perform backups or database exports on the Harvest server, we shut down all the services, including the Harvest brokers, the Oracle database and Apache Tomcat. We developed automated scripts that run on a schedule every evening, shut down the services, and bring them all back up when the backups and database export are complete. We updated these automated scripts for the new Harvest server and tested them to ensure that the backups and exports were running successfully.
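
As a minimal sketch of what such a script can look like on Windows, here is a batch outline; the service names, paths and connect string are assumptions for illustration only and will differ in your environment:

  @echo off
  rem Nightly export/backup outline (service names, paths and connect string are assumptions)
  rem Stop the Harvest broker and Tomcat so nothing touches the database during the export
  net stop "CA SCM Broker"
  net stop "Apache Tomcat"
  rem Run the database export while the Oracle instance is still available
  exp system/xxxxxx@scm12 owner=HARVEST file=F:\backups\scm12.dmp log=F:\backups\scm12_exp.log
  rem In our environment the Oracle database itself runs on UNIX, so stopping and restarting
  rem it for the file-level backup window is handled by a separate script on that host.
  rem Bring the services back up once the export and backups are complete
  net start "Apache Tomcat"
  net start "CA SCM Broker"

A script along these lines can be scheduled to run every evening with the Windows Task Scheduler, for example: schtasks /create /sc daily /st 22:00 /tn "HarvestNightly" /tr "E:\scripts\nightly.bat" (the task name and path are placeholders).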

The export that we used to import the projects from the Harvest 7.1 server contains Harvest environments that are not required on the new SCM R12 server. These projects consume resources (RAM and hard drive space) unnecessarily on this server. CA provided us with the hsysdelete command, which allows the deletion of project/repository data from the Harvest SCM R12 database. Below is the command, provided by CA, that we used to delete the unwanted Harvest environments:

hsysdelete -n "xxxx" -u "xxxxxx" -p XXXXXXXXX "Repository Name"

One of the last tasks of this upgrade was to install the Business Objects application to make the 44 canned reports that come with SCM R12 available. Managers and users will find these reports very useful for reporting on SCM activities by user(s) or projects.

Our Experience
We develop and maintain more than 198 applications at New Hampshire's Department of Information Technology. The applications are used extensively in our welfare and health services delivery agencies; examples include applications for child-care licensing and for managing adult and elderly care. Throughout the state, the applications are used by hundreds of users.

My synopsis and review of the SCM R12 upgrade process goes as follows:

As I stated earlier, this process is very straightforward if all that is involved is updating repository and project data. However, there is significant overhead if you are using forms, attachments to forms, User Defined Processes (UDPs), email notifications, password encryption, Enforce Package Bind flags, Verify Package Dependency flags, or canned or customized reports. The project I upgraded used many of these features, which made the upgrade more labor-intensive, and because the team had used Harvest aggressively and taken advantage of the power and robustness of the tool, there was a lot more to look at and attend to in order to make the upgrade go off successfully.
The more moving parts, the more possibility for issues to arise. The good thing is that CA has a great technical team that is very helpful. When issues arise, they have the resources to tackle them and help you be successful with your implementation.

Please feel free to contact me should you have any questions regarding the product (CA SCM AllFusion Harvest) and its use in our environment with various development tools.

More Stories By Al Soucy

Al Soucy is software configuration manager at the State of New Hampshire's Department of Information Technology (DoIT). In that role, Al manages software configuration for dozens of PowerBuilder applications as well as applications written in Java, .NET, and COBOL (yes, COBOL). Al plays bass guitar, acoustic guitar, electric rhythm/lead guitar, drums, mandolin, and keyboard; he sings lead and backup vocals, and he has released 8 CDs.
