PharmaSUG will be hosting a Single Day Event (SDE) at the Hamner Conference Center, which is located in the North Carolina Biotechnology Center in Research Triangle Park, NC, on October 25, 2018.
The registration fee is $75, which can be applied as a credit towards your PharmaSUG 2019 registration. Online registration ends on October 22, 2018; registration after October 22 will be available on site.
Thanks to our Sponsors!
Presentation Abstracts

Advanced Visualization Using TIBCO Spotfire and SAS Using SDTM Data
Ajay Gupta, PPD
In the pharmaceutical/CRO industries, you may receive requests from stakeholders for quick access to clinical data so they can explore the data interactively and gain a deeper understanding of it. TIBCO Spotfire is an analytics and business intelligence platform that enables interactive data visualization. Users can further integrate TIBCO Spotfire with SAS (used for data programming) and create visualizations with powerful functionality, e.g., data filters and data flags. These visualizations help users self-review the data in multiple ways and can save a significant amount of time. This paper will demonstrate some advanced visualizations from the Preclarus Patient Data Dashboard (Preclarus PDD) within PPD, created using TIBCO Spotfire and SAS (for the SDTM database), and share our experiences and challenges while creating these visualizations.
Simplify and Streamline Using Python
Michael Stackhouse, Covance
A great deal of work goes into preparing for a submission, including work beyond the analysis itself. Preparing and maintaining documentation can be tedious, and keeping track of updates becomes difficult as deadlines approach. This presentation will explore how Python can simplify and streamline some of these tasks. Topics will include identifying changes in specifications, automating a hyperlinked program table of contents, de-macrotizing SAS programs, and more.
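One of the tasks mentioned above, identifying changes between versions of a specification, can be sketched with Python's standard library alone. This is an illustrative example only, not code from the presentation; the file labels and specification lines are invented:

```python
import difflib

# Two hypothetical versions of a variable-level specification,
# one line per variable definition.
spec_v1 = [
    "AVAL,Analysis Value,Num,8",
    "AVALC,Analysis Value (C),Char,20",
    "PARAMCD,Parameter Code,Char,8",
]
spec_v2 = [
    "AVAL,Analysis Value,Num,8",
    "PARAMCD,Parameter Code,Char,8",
    "PARAM,Parameter,Char,200",
]

# unified_diff marks lines removed (-) from v1 and added (+) in v2.
diff = list(difflib.unified_diff(spec_v1, spec_v2,
                                 fromfile="spec_v1", tofile="spec_v2",
                                 lineterm=""))
for line in diff:
    print(line)
```

In practice the specification would be read from files (or exported from a spreadsheet) rather than hard-coded, but the diffing step is the same.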
Streamlining the Metadata Management Process Using SAS Life Science Analytics Framework
Alex Ford, SAS
It was not long into my clinical programming career before I discovered that CDISC is truly an acronym for “Can Do It Somewhat Correctly”. Each run of a validation report uncovered new warnings or errors, followed by the chore of tracking down the source of those issues to log and report for a define.xml. The latest release of the SAS Life Science Analytics Framework (LSAF) provides a centralized framework where standards can be imported and live alongside a study and its data, managed by a graphical user interface. By associating a data standard, controlled terminology, and dictionaries with a study, team leads have the data and information necessary to produce a define.xml at the click of a button. Join us as we explore the metadata management features available in LSAF 5.1 that enable programmers of all levels to manage data standards correctly the first time, saving studies both time and money.
ADaM 2018: What's New and What's Coming
Jack Shostak, Duke Clinical Research Institute and Sandra Minjoe, PRA
ADaM has sub-teams working on documents for public review and finalization within 2018. These include:
- ADaMIG v1.2
- ADaM Structures for Integration v1.0
- ADaMIG for Medical Devices v1.0
- ADaM OCCDS v1.1
- ADaM Traceability Examples v1.0
- ADaM BDS for Non-Compartmental (PK) Analysis v1.0
- ADaM Structures for Oncology v1.0
- IACET-approved ADaM Training Courses
ADaM Structures for Integration: A Preview
Wayne Zhong, Accretion Softworks
Integration and analysis of data across all studies in a submission is a vital part of applications for regulatory approval in the pharma industry. The existing ADaM classes (ADSL, BDS, and OCCDS) already support some simple cases of integration analysis. However, there has been a need for an integration standard that supports the more complex cases. To address this need, the ADaM Integration sub-team is developing the upcoming ADaM Integration standards document. This paper introduces the new IADSL, IBDS, and IOCCDS classes found in this document. IADSL allows for multiple records per subject. IBDS and IOCCDS work effectively with the new IADSL class. This paper also discusses the analysis needs that necessitated the creation of the new classes, and provides examples in the form of usage scenarios, data, and metadata. With them, no future integration will prove too complex. PharmaSUG 2018 Best Paper Winner
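As an illustration of the structural change described above, the following minimal Python sketch shows subject-level data integrated across studies, with one record per subject per study. The variable names and values are hypothetical and are not taken from the draft standard:

```python
from collections import Counter

# Hypothetical integrated subject-level data: unlike classic ADSL
# (one record per subject), an integrated structure may carry one
# record per subject per study for subjects enrolled in more than
# one study.
iadsl = [
    {"USUBJID": "SUBJ-001", "STUDYID": "STUDY-A", "TRT01P": "Drug 10mg"},
    {"USUBJID": "SUBJ-001", "STUDYID": "STUDY-B", "TRT01P": "Drug 20mg"},
    {"USUBJID": "SUBJ-002", "STUDYID": "STUDY-A", "TRT01P": "Placebo"},
]

# SUBJ-001 participated in two studies, so it has two records.
records_per_subject = Counter(row["USUBJID"] for row in iadsl)
print(records_per_subject)
```

The key point is that the unique key becomes subject plus study, which is what classic ADSL, with its one-record-per-subject rule, cannot represent.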
A Framework for Implementing [Conflicting] FDA Guidance
Todd Case, Vertex Pharmaceuticals
On July 21, 2004, the US Food and Drug Administration (FDA) announced a format, called the Study Data Tabulation Model (SDTM), that sponsors can use to submit data to the agency. Twelve years later (on December 17, 2016) the FDA began enforcing the requirement of standardized electronic data submissions in SDTM format, and now, in addition to SDTM, there are multiple sources (and versions) of data standards which impact data supporting applications to the FDA: the FDA Data Standards Catalog (primary list and source of standards) AND the Study Data Standardization Plan, the SDTM model (Version 1.4), the SDTM Implementation Guide (SDTMIG – Version 3.2), the Analysis Data Model (ADaM) - Version 2.1, the ADaM Implementation Guide (Version 1.1), the FDA Guidance for Industry (April, 2017), the Study Data Technical Conformance Guide (October, 2017) and the Prescription Drug User Fee Act (PDUFA), Version V for Fiscal Years 2013-2017 and VI for Fiscal Years 2018-2022. At times, these documents, guidances, and laws can be contradictory, and it is up to the Sponsor (when appropriate) to engage with the FDA to determine which “standard” (of the standards) to adopt, which version(s) to use, and when to update versions. PharmaSUG 2018 Best Paper Winner
Preparing ADaM Datasets and Related Files for FDA Submission
Ragini Hari and Sandra Minjoe, PRA
This presentation compiles information from documents produced by the U.S. Food and Drug Administration (FDA), the Clinical Data Interchange Standards Consortium (CDISC), and Computational Sciences Symposium (CSS) workgroups to identify what analysis data and other documentation are to be included in submissions and where it all needs to go. The paper not only describes requirements, but also includes recommendations for things that aren't so cut-and-dried. It applies to New Drug Application (NDA) submissions and the subset of Biologic License Application (BLA) submissions that are described in specific FDA binding guidance documents plus other related FDA documents.
Define XML Expectations from Various Clients, Tools and Industry Groups
Gustav Bernard and Zach Dorman, IQVIA
Define-XML 2.0 has been around for a while, and yet we are still struggling with its implementation due to a lack of firm instructions and expectations. During this talk, we would like to discuss the differing expectations of different pharma companies, what is expected by CDISC, and what information CDISC provides. We will also present some Pinnacle 21 (P21) findings that are not in agreement with CDISC standards, and some that cause confusion, and talk about what we feel is expected by the FDA.
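For readers less familiar with the file under discussion: define.xml is an ODM-based XML document. A minimal sketch of reading its dataset-level metadata with Python's standard library follows; the namespace URIs match Define-XML 2.0 and ODM 1.3, but the sample fragment is invented and far smaller than a real, schema-valid define.xml:

```python
import xml.etree.ElementTree as ET

# A tiny invented fragment using the ODM 1.3 / Define-XML 2.0 namespaces.
define_xml = """<ODM xmlns="http://www.cdisc.org/ns/odm/v1.3"
     xmlns:def="http://www.cdisc.org/ns/def/v2.0">
  <Study OID="ST.001">
    <MetaDataVersion OID="MDV.1" Name="Study MDV">
      <ItemGroupDef OID="IG.ADSL" Name="ADSL" Repeating="No"
                    Purpose="Analysis"
                    def:Structure="One record per subject"/>
      <ItemGroupDef OID="IG.ADAE" Name="ADAE" Repeating="Yes"
                    Purpose="Analysis"
                    def:Structure="One record per adverse event"/>
    </MetaDataVersion>
  </Study>
</ODM>"""

ODM_NS = "{http://www.cdisc.org/ns/odm/v1.3}"
root = ET.fromstring(define_xml)

# Each ItemGroupDef describes one dataset; collect the dataset names.
datasets = [ig.get("Name") for ig in root.iter(f"{ODM_NS}ItemGroupDef")]
print(datasets)
```

A real define.xml carries far more metadata (variable definitions, controlled terminology, value-level metadata, origins), which is precisely where the differing expectations discussed in this talk arise.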
CDISC Standards: Evolving to Meet Submission Needs
Diane Wold, CDISC
Although CDISC teams have sometimes organized pilots that involve submissions to FDA (the SDTM/ADaM Pilot Project, the SEND Fit-for-Use Pilot), for the most part CDISC is not involved in creating and submitting e-submissions. So how does CDISC contribute to solutions that make e-Submissions happen and make them matter? For the most part, CDISC contributes by developing standards, including implementation guides, rules, and controlled terminology, and by improving access to the standards, as with the SHARE exports and API. The CDISC standards evolve to cover more content, provide more detail, clarify ambiguities, and fix problems. As CDISC standards have evolved, they have proliferated.
Study Data Topics at FDA/CDER
Sara Jimenez, FDA/CDER
A sponsor’s goal should be to collect and submit study data with integrity and the highest possible quality. This presentation will build upon this event’s e-submission solutions theme and focus on the following study data topics at CDER: review issues related to data quality and format, technical rejection criteria for study data, and logically skipped instrument items.