By PR Newswire
March 19, 2014 10:00 AM EDT
TORONTO, March 19, 2014 /CNW/ - While public debate has mainly focused on the "gold-plated" defined benefits of many public-service pension plans, the real problem lies in a flawed approach to managing compensation costs, according to respected pension expert Malcolm Hamilton. In "Evaluating Public-Sector Pensions: How Much Do They Really Cost?", Hamilton argues that government sponsors typically underestimate the cost of guaranteeing future pay-outs, leading to the undervaluation of employee pension costs and the mismanagement of employee compensation.
"The problem is not with the defined-benefit plans per se; it lies in the mispricing of their guarantees, which leads to the over-compensation of employees and badly accounted-for risks to future taxpayers," says Hamilton. "This is particularly true in the federal public sector, where plan members are insulated from investment risk, as compared to the provincial public sectors, where members frequently bear half of the risk, if not more."
In the first of a two-part series on government employee pensions, Hamilton observes that public-sector pension plans in Canada have many virtues. "They are generally large, efficient and well managed. However, there are large differences between the fair values of the pensions earned by public-sector employees and the 'cost' of these pensions according to public-sector financial statements," states Hamilton.
These differences arise almost entirely from the pricing of guarantees. Specifically, the financial markets attach high values to the guarantees embedded in public-sector pension plans while government financial statements attach little or no value to these guarantees. This means that pension costs are materially understated and, as a consequence:
- employees in the public sector are paid more than is publicly acknowledged and, in many instances, more than their private-sector counterparts;
- public-sector employees shelter more of their retirement savings from tax than other Canadians are permitted to shelter; and
- taxpayers bear much of the investment risk taken by public-sector pension plans, while the reward for risk-taking goes to public employees as higher compensation.
Hamilton notes that private-sector pension accounting standards long ago rejected the premise at the heart of today's public-sector accounting standards: that the cost of a fully guaranteed pension depends critically on the rates of return that a pension fund can earn on risky investments, even though the pension itself is totally unaffected by those returns.
Public-sector accounting practice books the returns that a pension fund might reasonably expect to earn as a reward for future risk taking long before the risks are taken, and this risk premium is used to reduce the reported cost of employee pensions. As a consequence, in a traditional defined benefit pension plan where pensions are guaranteed and employee contributions are fixed, the reward for future risk-taking goes to employees who, because their pensions are fully guaranteed, take no risk. Future taxpayers, on the other hand, will be expected to bear risk without fair compensation.
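The arithmetic behind this argument can be sketched in a few lines. The figures below are hypothetical, not drawn from the report: they simply show how discounting the same guaranteed payment stream at an expected return on risky assets, rather than at a risk-free rate, produces a materially lower reported cost.

```python
def present_value(payment, years, rate):
    """Present value of a level annual payment for `years` years,
    discounted at `rate` (first payment one year from now)."""
    return sum(payment / (1 + rate) ** t for t in range(1, years + 1))

payment = 50_000   # hypothetical guaranteed annual pension
years = 25         # hypothetical payout period

# Pricing the guarantee at an assumed risk-free bond yield of 3%:
cost_risk_free = present_value(payment, years, 0.03)

# Discounting at an assumed 6% expected return on risky assets,
# i.e. booking the risk premium before any risk is taken:
cost_expected_return = present_value(payment, years, 0.06)

print(f"Cost at 3% (risk-free):      ${cost_risk_free:,.0f}")
print(f"Cost at 6% (expected return): ${cost_expected_return:,.0f}")
# The second figure is roughly a quarter smaller, even though the
# pension itself is guaranteed and unaffected by fund returns.
```

Under these assumed rates, the reported liability shrinks by over $200,000 per member purely through the choice of discount rate, which is the understatement Hamilton describes.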
"Essentially, we have devised a complicated way to transfer wealth from future taxpayers to current plan members," notes Hamilton.
The good news, he says, is that once the accounting problem is recognized for what it is, the solution becomes obvious. "The risks that taxpayers are being asked to bear without compensation should be transferred, in whole or in part, to the plan members on whose behalf these risks are being taken."
This can be accomplished in a variety of ways, says Hamilton. Benefits can be tied to funding levels and/or to the performance of pension funds. Employee contributions and/or salaries can be tied to the cost of funding their pensions. "Many provincial governments have already started to move in this direction," notes the author.
The C. D. Howe Institute is an independent not-for-profit research institute whose mission is to raise living standards by fostering economically sound public policies. It is Canada's trusted source of essential policy intelligence, distinguished by research that is nonpartisan, evidence-based and subject to definitive expert review. It is considered by many to be Canada's most influential think tank.
SOURCE C.D. Howe Institute