|By Business Wire||
|February 8, 2013 11:23 AM EST||
Fitch Ratings has assigned an 'A' rating to AT&T Inc.'s (AT&T) offering of $2.25 billion of senior unsecured notes due 2016. The offering consists of $1 billion of 0.9% fixed rate notes and $1.25 billion of floating rate notes. Proceeds are expected to be used for general corporate purposes. AT&T's Issuer Default Rating (IDR) is 'A', and the Rating Outlook is Negative.
Key Rating Drivers
The rating is supported by:
--AT&T's financial flexibility;
--The company's diversified revenue mix;
--Its significant size and economies of scale as the largest telecommunications operator in the U.S.; and
--Fitch's expectation that AT&T will benefit from continued growth in wireless operating cash flows.
The following concern is embedded in the rating:
--The Negative Outlook reflects Fitch's expectation that AT&T's net leverage is likely to rise to the 1.8x upper boundary, a notable increase from the roughly 1.5x level maintained over the past couple of years.
AT&T's increased leverage is expected to arise from the combined effects of a moderate increase in wireless and wireline capital spending and the continuation of the company's share repurchase program as announced in early November 2012. Prospective leverage expectations are subject to uncertainty caused by the rate of stock repurchases, actual capital expenditure levels, possible acquisitions (such as longer-term spectrum needs) and asset divestitures (none of which are included in Fitch's expectations).
In January 2013, AT&T announced two transactions to improve its wireless spectrum position. Subject to regulatory approval, AT&T will acquire certain rural wireless assets and spectrum from Atlantic Tele-Network, Inc. for $780 million in cash. Additionally, the company has a pending transaction to acquire certain B-block 700 MHz licenses from Verizon Wireless for $1.9 billion in cash and certain Advanced Wireless Services spectrum licenses in several markets. In Fitch's view, the proposed acquisitions are strategically sound as they are supportive of wireless growth. The transactions, if completed, are not currently expected to push the company's credit metrics above the 1.8x net leverage level.
For 2013, Fitch expects AT&T's gross leverage to approximate 1.7x, flat with 2012 (excluding the actuarial losses on its benefit plans). Net leverage in 2012 was 1.58x. Over the next few years, AT&T's continuation of stock repurchases will require some borrowing as repurchases will be above FCF levels. Leverage will rise, with net leverage expected to peak near a 1.8x upper boundary in 2014. Thereafter, leverage is expected to decline over time.
In Fitch's view, liquidity is strong and provided by the company's FCF; additional financial flexibility is provided by availability on the company's revolving credit facilities. At Dec. 31, 2012, total debt outstanding was approximately $69.8 billion, a $5 billion rise from the $64.8 billion outstanding at the end of 2011. Of the total amount outstanding, $3.5 billion consists of debt due within one year, including debt that can be put to the company. At Dec. 31, 2012, cash amounted to $4.9 billion, and for 2012, AT&T produced $9.2 billion in FCF (net cash provided by operating activities less capital expenditures and dividends), an amount short of the $12.8 billion in stock repurchases during the year. Fitch expects FCF to decline from $9.2 billion in 2012 to $4 billion annually, on average, over the next three years.
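The leverage arithmetic above can be reproduced from the figures in the release. A minimal sketch, assuming leverage is defined as debt divided by EBITDA (the implied EBITDA below is a back-of-envelope derivation from the stated 1.7x gross leverage, not a figure Fitch publishes):

```python
# Figures from the release, in $ billions, at Dec. 31, 2012.
total_debt = 69.8
cash = 4.9
gross_leverage = 1.7  # Fitch's expected 2013 level, "flat with 2012"

# Implied EBITDA, derived from gross leverage = total_debt / EBITDA.
ebitda = total_debt / gross_leverage  # roughly 41

# Net leverage nets cash against debt; should land near the stated 1.58x.
net_leverage = (total_debt - cash) / ebitda

# Repurchases above FCF imply incremental borrowing.
fcf = 9.2           # operating cash flow less capex and dividends
repurchases = 12.8
funding_gap = repurchases - fcf  # about 3.6, consistent with debt rising

print(round(net_leverage, 2), round(funding_gap, 1))
```

The computed net leverage of about 1.58x matches the 2012 figure cited above, and the roughly $3.6 billion repurchase shortfall is directionally consistent with the $5 billion rise in total debt during 2012.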
At the end of 2012, the company had no drawings on its revolving credit facilities. The principal financial covenant for the 2016 and 2017 facilities requires debt to EBITDA, as defined, to be no more than 3x.
Relative to the company's expected free cash flows, upcoming debt maturities are manageable. In 2013, debt maturities approximate $3.4 billion, including approximately $1.6 billion in debt that may be put to the company. Maturities amount to $3.8 billion in 2014.
The Rating Outlook could be revised to Stable if:
--The company steadily manages net leverage down from Fitch's expected peak just under 1.8x in 2014;
--Fitch concludes that leverage will not reach the expected peak, owing to factors including, but not limited to, stronger operating results, lower capital spending, and the effect of any acquisitions or divestitures that may occur.
A negative rating action could occur if:
--Net leverage remains above (or is expected to remain above) the 1.8x level for several quarters, including expected leverage resulting from a material transaction;
--Fitch believes management has weakened its commitment to returning to, or operating longer-term with, leverage at a level more reflective of the rating.
Additional information is available at 'www.fitchratings.com'. The ratings above were solicited by, or on behalf of, the issuer, and therefore, Fitch has been compensated for the provision of the ratings.
Applicable Criteria and Related Research:
--'Corporate Rating Methodology' (Aug. 8, 2012);
--'Rating Telecom Companies - Sector Credit Factors' (Aug. 9, 2012).