CTSI: Evaluation Program
Nancy R. Lowitt, M.D., M.Ed.
Academic Title: Assistant Professor
Primary Appointment: Medicine
Secondary Appointments: Administration
Administrative Title: Associate Dean
Location: Bressler Research Building, 14-015
Phone: (410) 706-3681
Frank Palumbo, B.S., M.S., Ph.D., J.D.
Center on Drugs and Public Policy (CDPP)
Saratoga Building, 12th Floor
Specific Aim 1: To provide consultation, support, and assistance to UM CTSI Resource and Program Centers as they develop and refine their respective evaluation strategies and processes to achieve both short- and long-term aims.
Specific Aim 2: To track data and monitor the progress of each Resource and Program Center, to facilitate interaction related to outcomes measurement, and to provide independent periodic reports to the UM CTSI Internal Advisory Board and External Advisory Board.
Specific Aim 3: To track the integrated themes or aims of the UM CTSI program with outcomes data from the Resource and Program Centers and with outcomes strategies designed to address the broader transformational impact of this program on our scientific communities, partners, and collaborative work.
Specific Aim 4: To develop links with other CTSAs and the National CTSA Evaluation Plan, sharing tools and strategies with others and identifying examples of successful strategies and tools for use by our Resource and Program Centers.
The Evaluation Team, co-led by UM School of Medicine (SOM) Associate Dean for Professional Development Nancy Ryan Lowitt, MD, and Francis B. Palumbo, PhD, JD, Professor and Executive Director of the School of Pharmacy Center on Drugs & Public Policy, will include the leaders of each UM CTSI Resource or Program Center and a Program Manager. Expertise in statistical analysis will be provided by Larry Magder, PhD and Mei-Ling Lee, PhD, Directors of the CTSI Center for Design, Biostatistics and Quantitative Research (DBQR). Expertise in Web site development and maintenance will be provided by Larry Roberts, MS, Director of Web Communications, SOM. This group will work together using the following strategies to fulfill the specific aims of this key function of the UM CTSI.
Specific Aim 1:
Outcomes assessment for this aim will be based on: (1) documentation of Resource or Program Center-specific processes for assessing progress toward aims; (2) documentation of Center-specific formative quality improvement feedback loops for years 1–5 of the project; and (3) documentation of Center-specific summative evaluation (degree to which aims have been achieved) and evidence of inter-Center integration and synergy for years 4–5. Each of the UM CTSI Resource and Program Centers has identified aims, and the Evaluation Team has begun to work with them to refine these aims and the choices of evaluation methods and tools. This work will continue through the first year of the project, led by Dr. Lowitt and Dr. Palumbo. The Evaluation Team will assist resources/programs in developing continuous quality process assessment methods. These will be collected, reviewed, and reported periodically to the Resource and Program Center Evaluation Team and UM CTSI leadership, with midcourse corrections and follow-up assessment tracking methods provided as needed. The Evaluation Team will provide guidance to individual resources/programs on their aims-related outcomes measures. The Team will work intensively with resources/programs and UM CTSI leadership in the first year of the project to define the network of intermediate and long-term outcomes measurement strategies that (1) will best provide data in later years on the success of the resources’/programs’ processes and (2) will identify where successful inter-Center integration and synergy have occurred. These activities will include helping resources/programs identify unmet needs and assess user satisfaction with resources and services provided by CTSI resources/programs.
The Evaluation Team will provide periodic educational updates for each resource/program so that resources/programs and the Evaluation Team and CTSI investigators share a common language and frame of reference for process and outcomes evaluation. These live educational updates will be coupled with Web-based training and individual follow-up as needed.
Specific Aim 2:
Outcomes assessment for this aim will be based on documentation of periodic reports and progress via an interactive evaluation page on the UM CTSI HARBOR Web portal. The Evaluation Team will collect periodic report data from the individual Resource and Program Centers, track the reporting of feedback to each Center’s leadership, and track quality improvement strategies developed by the Centers in response to evaluation feedback. The Team will assess the effectiveness of each Center’s assessment strategies and report these semiannually, and as requested, to the UM CTSI Internal Advisory Board and External Advisory Board. The tracking of quarterly reports and feedback to Center leadership will be performed by Robertha Simpson, Program Manager, and overseen by Drs. Lowitt and Palumbo. In the event that a major deficiency or other evaluation/quality issue arises, the Evaluation Team will have ready access to the monthly CTSI Executive Steering and Integration Committee (as serving members) to make an ad hoc report. In keeping with its aim to establish a transparent evaluation and oversight process that is grounded firmly in the aims of each UM Center, the Evaluation Team will develop, in the first year of the project, a Web site for tracking and reporting the aims, outcomes, and quality improvement strategies of each Center. The site will provide a location for data reporting and storage, as well as for sharing interim results for internal use by Centers. The site will also serve as a resource for the Evaluation Team to share information and examples of evaluation tools that may be disseminated by other CTSAs. Finally, the site will receive anonymously submitted feedback related to CTSA performance or outcomes from any member of the University community or the public; this feedback will be reviewed and forwarded to the PIs.
Using the Clinical and Translational Research Informatics Program (CTRIP), the Evaluation Team will provide the resources/programs with the information technology support they will require to efficiently report their process and outcomes data and to track their own outcomes and the outcomes of other Centers. We will be able to post evaluation tools and strategies for validating tools where necessary and provide links to other sites where Center or Program staff may seek additional expertise. A well-designed Web page will provide the Evaluation Team with a clear picture of the work of each Center in progress, as well as an overview of a network of interrelated metrics that assess performance by the UM CTSI as a whole. The Web site will provide a location for posting interim results and a library of resources and links to the larger CTSA Evaluation community.
Specific Aim 3:
Outcomes assessment for this aim will be based on documentation of Evaluation Team and project team leadership discussions and assessment of data supporting integrated aims or themes of the proposal, as well as on submission of products to the National CTSA Evaluation Plan. We will incorporate direct feedback from the community obtained from Resource or Center-specific evaluations, periodic outreach efforts, and Web-based anonymous feedback. From this network of interrelated metrics, the Evaluation Team will identify evaluation processes to address the integrated goals and mission of the proposal. Drs. Lowitt and Palumbo will work closely with the leaders of each Resource or Program Center and with the PIs in the first 2 years of the project to develop these processes fully. The early focus will be specifically on the mature project aims of promoting innovation in multidisciplinary research, bidirectional process oversight, and interdisciplinary research. We anticipate that this work will benefit from best-practices models in development at other CTSAs. We further anticipate that a logic model combining Center-based continuous quality assessment and monitoring with leadership oversight of the integrated aims of the proposal may provide an innovative contribution to the National CTSA Evaluation Plan.
Specific Aim 4:
Outcomes assessment for this aim will be based on demonstration of Web site links to products available in year 1 of our project, on participation in all National CTSA Evaluation Plan conferences, and on documentation of periodic live and Web-based educational programs based on disseminated information and best practices from the National CTSA Evaluation Plan. The UM CTSI Evaluation Team, led by Drs. Lowitt and Palumbo, will continue to follow the activities and standards set by the national CTSA Evaluation Steering Committee. The Evaluation Team is prepared for and expects to participate in efforts to coordinate with the national CTSA Evaluation Steering Committee, contribute to the development of a consensus on national metrics for evaluation, adopt common terms and definitions as proposed by the CTSA Evaluation Steering Committee, and participate in the exchange of information on best practices and challenges. We anticipate participating in national conferences and Webcasts and providing local updates to our investigators and evaluators through live and Web-based dissemination strategies.