Heart Hospital of Austin First in Texas to Implant New Portico™ Re-sheathable Transcatheter Aortic Valve System as Part of Clinical Trial

AUSTIN, Texas, Aug. 27, 2014 /PRNewswire/ -- On Aug. 26, 2014, physicians at Heart Hospital of Austin became the first in Texas to implant the new Portico™ Re-sheathable Transcatheter Aortic Valve System—an innovative, repositionable aortic heart valve and delivery system. Heart Hospital of Austin is part of the PORTICO trial, a nationwide clinical study to examine the effectiveness of the new heart valve. The Portico valve was used during a catheter-based valve replacement procedure known as a transcatheter aortic valve replacement (TAVR), which is designed to treat patients with severe aortic stenosis who are not candidates for open-heart surgery because of advanced age, serious illness, or additional medical conditions. The TAVR procedure was first performed in Central Texas at Heart Hospital of Austin in February 2012.

Faraz Kerendi, M.D., a cardiothoracic surgeon at Cardiothoracic and Vascular Surgeons, and Frank Zidar, M.D., an interventional cardiologist at Heart Hospital of Austin and with Austin Heart, implanted the Portico valve. The patient, a 68-year-old man, was identified as a candidate for the new Portico valve by physicians at the Heart Valve Clinic at Heart Hospital of Austin, a clinic specifically designed to evaluate and treat patients with valvular disease and disorders. Drs. Kerendi and Zidar are principal investigators of the PORTICO trial at Heart Hospital of Austin.

"The procedure was very successful," Dr. Kerendi said. "The ability to fully resheath and precisely reposition the Portico valve prior to final valve deployment was very beneficial, as it helped achieve accurate placement and minimize procedural risk for the patient."

The Portico valve is the first fully repositionable transcatheter valve—allowing the physician to accurately place the valve at the implant site via a catheter, or retrieve it before the valve is fully deployed and released from the delivery system. The ability to reposition the valve helps physicians place it more accurately, reducing the risk for patients. The self-expanding Portico valve was developed to maintain blood flow similar to that of a natural valve.

"We are excited to have the opportunity to participate in this trial, as this valve has the potential to greatly impact patients and improve their quality of life," Dr. Zidar said. "This unique valve design allows the device to be assessed after implantation and, if necessary, repositioned to optimize its function."

Artificial aortic heart valves are used to treat patients with symptomatic severe aortic stenosis, a condition in which the opening of the aortic valve becomes narrow, restricting blood flow from the heart to the body and causing the heart to work harder. Over time, the valve can become calcified, preventing it from opening and releasing blood properly. Symptoms of severe aortic stenosis include fatigue, dizziness or fainting, and chest pain or discomfort. It can also cause pressure to "back up" into the lungs, resulting in shortness of breath. Left untreated, this condition is often fatal.

Traditional heart valve replacement requires that patients undergo open-heart surgery—a more invasive approach that results in a longer recovery process. Because of this, some patients with aortic stenosis are not considered candidates for this type of treatment.

The TAVR procedure consists of inserting a valve—which is compressed down to the size of a pencil—through the groin via a catheter, up to the aorta. After the catheter is advanced through the aorta and aortic valve, the valve is positioned and then opened with a balloon. TAVR results in a shorter recovery time—one to two weeks, versus six to eight weeks for traditional surgery—and it eliminates the need for a heart bypass machine, allowing the patient's heart to beat on its own throughout the entire procedure.

The Portico transcatheter aortic heart valve system is limited to investigational use in the United States, and the PORTICO trial is being conducted under an Investigational Device Exemption from the U.S. Food and Drug Administration.

Heart Hospital of Austin is one of 40 sites in the United States participating in the PORTICO trial. Enrolled patients will undergo a TAVR procedure, receiving either a Portico valve or another TAVR valve that is commercially available in the United States. Additionally, the trial will collect information on patients who are having a Portico valve placed inside an existing, degenerated surgical valve. This valve-in-valve registry will include patients in the trial who previously had valve replacement surgery and are now having a Portico valve placed inside the existing artificial valve without removing it.

For additional information about the Portico valve, visit SJMPortico.com; for information about the PORTICO trial, visit ClinicalTrials.gov.

Media Contacts:
Erin Ochoa and Lisa Candido
Elizabeth Christian & Associates Public Relations
512.472.9599

SOURCE Heart Hospital of Austin
