Academic literature on the topic 'Metrics Program'
Create an accurate reference in APA, MLA, Chicago, Harvard, and other citation styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Metrics Program.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Journal articles on the topic "Metrics Program"
Volmar, Keith E., Shannon J. McCall, Ronald B. Schifman, Michael L. Talbert, Joseph A. Tworek, Keren I. Hulkower, Anthony J. Guidi, et al. "Professional Practice Evaluation for Pathologists: The Development, Life, and Death of the Evalumetrics Program." Archives of Pathology & Laboratory Medicine 141, no. 4 (April 1, 2017): 551–58. http://dx.doi.org/10.5858/arpa.2016-0275-cp.
Burgess, Hayley, Joan Kramer, Elizabeth Hofammann, and Mandelin Cooper. "Clinical Metrics for a Large Healthcare System’s Antimicrobial Management Program." Infection Control & Hospital Epidemiology 41, S1 (October 2020): s7. http://dx.doi.org/10.1017/ice.2020.478.
Ramchandani, Chander, and Carolyn Buford. "IMPLEMENTING A SUCCESSFUL METRICS PROGRAM." INCOSE International Symposium 6, no. 1 (July 1996): 1036–42. http://dx.doi.org/10.1002/j.2334-5837.1996.tb02118.x.
Dewland, Jason C., and Andrew See. "Notes on Operations: Patron Driven Acquisitions: Determining the Metrics for Success." Library Resources & Technical Services 59, no. 1 (January 23, 2015): 13. http://dx.doi.org/10.5860/lrts.59n1.13.
Jabbar P., Abdul, and S. Sarala. "Advanced Program Complexity Metrics and Measurement." International Journal of Computer Applications 23, no. 2 (June 30, 2011): 29–33. http://dx.doi.org/10.5120/2860-3679.
Zhang, Kang, and Narasimhaiah Gorla. "Locality metrics and program physical structures." Journal of Systems and Software 54, no. 2 (October 2000): 159–66. http://dx.doi.org/10.1016/s0164-1212(00)00059-5.
Hyatt, Lawrence E., and Linda H. Rosenberg. "Software metrics program for risk assessment." Acta Astronautica 40, no. 2-8 (January 1997): 223–33. http://dx.doi.org/10.1016/s0094-5765(97)00148-3.
Buckley, F. J. "Standards-establishing a standard metrics program." Computer 23, no. 6 (June 1990): 85–86. http://dx.doi.org/10.1109/2.55506.
Fenick, S. "Implementing management metrics: an Army program." IEEE Software 7, no. 2 (March 1990): 65–72. http://dx.doi.org/10.1109/52.50775.
Lynn, Marilyn, Douglas Bronson, and William Gunnar. "The impact of benchmarking operating room efficiency within the Veterans Health Administration." International Journal of Healthcare 5, no. 1 (October 28, 2018): 8. http://dx.doi.org/10.5430/ijh.v5n1p8.
Dissertations / Theses on the topic "Metrics Program"
Hitchcock, T. L. "Metrics for object-oriented program control." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape7/PQDD_0005/MQ46256.pdf.
Robison, Dawn M. 1967. "Transformational metrics for product development." Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/34724.
Includes bibliographical references (p. 113-116).
The research provides a case study of performance metrics within the framework of the product development process and team effectiveness. A comparative analysis of eight product development teams was done to evaluate the teams' effectiveness in achieving three outcomes - customer satisfaction, shareholder value and time to market. A survey was conducted to evaluate areas where no formal documentation existed and to supplement the existing historical data that were collected from databases and documents. The analysis was done on two levels - by program team and individual respondent - and looked at the level of performance and effort that influenced the specific outcomes. It was concluded that performance metrics are used within an organization to drive actions, to assess progress and to make decisions. Conclusions were consistent with the premise that people perform to how they are measured and that the team effectiveness can be driven by a set of performance metrics that are aligned with the strategic goal of the organization. Transformational metrics were developed within the framework of understanding the interdependence of the social and technical systems. Choosing the right metrics is critical to an organization's success because the metrics directly influence behavior and establish the culture within the firm. It was determined that if the right combinations of metrics are selected, teams will act in such a way as to maximize their effectiveness and behave in a manner that achieves the corporate goals.
by Dawn M. Robison.
S.M.
Johnson, John H. (John Howard) 1965. "Metrics for a platform team." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/88321.
Dufour, Bruno. "Objective quantification of program behaviour using dynamic metrics." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=81328.
Full textIn order to make these intuitive notions of program behaviour more concrete and subject to experimental validation, this thesis introduces a methodology to objectively quantify key aspects of program behaviour using dynamic metrics. A set of unambiguous, dynamic, robust and architecture-independent dynamic metrics is defined, and can be used to categorize programs according to their dynamic behaviour in five areas: size, data structures, memory use, polymorphism and concurrency. Each metric is also empirically validated.
A general-purpose, easily extensible dynamic analysis framework has been designed and implemented to gather empirical metric results. This framework consists of three major components. The profiling agent collects execution data from a Java virtual machine. The trace analyzer performs computations on this data, and the web interface presents the result of the analysis in a convenient and user-friendly way.
The utility of the approach as well as selected specific metrics is validated by examining metric data for a number of commonly used benchmarks. Case studies of program transformations and the consequent effects on metric data are also considered. Results show that the information that can be obtained from the metrics not only corresponds well with the intuitive notions of program behaviour, but can also reveal interesting behaviour that would have otherwise required lengthy investigations using more traditional techniques.
Israel, Mark Abraham. "Heuristic program reorganization guided by object-oriented metrics." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/mq20976.pdf.
Raiyani, Sangeeta. "Incorporating design metrics into a company-wide program." Virtual Press, 1990. http://liblink.bsu.edu/uhtbin/catkey/722468.
Full textDepartment of Computer Science
Blackburn, Craig D. (Craig David) S. M. Massachusetts Institute of Technology. "Metrics for enterprise transformation." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/54657.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 148-161).
The objective of this thesis is to depict the role of metrics in the evolving journey of enterprise transformation. To this end, three propositions are explored: (i) metrics and measurement systems drive transformation; (ii) employee engagement is a proxy to gauge transformation progress; and (iii) metric considerations enable enterprise transformation when systematically executed as part of a transformation roadmap. To explore this problem, the aerospace measurement community was consulted to help gain a better understanding of the context in which transformation is currently defined and measured. Once the problem space was defined, the environment of doing research with the enterprise as the unit of analysis was described with the intent of exploring the role of metrics and transformation. In particular, the performance measurement literature helped identify tools and methods used to select metrics to enable decision making at the enterprise level. After this review, two case studies were performed, considering: (1) the implementation of a bottom-up measurement system to drive transformation and (2) the effect of a top-down corporate measurement system on the enterprise. The first case study revealed insights regarding the benefits and challenges of implementing measurement systems and highlighted the use of employee engagement as a proxy to measure enterprise transformation. In the second case study, contemporary measurement issues were discussed and mapped to an Eight Views of the Enterprise analysis to identify critical enterprise interactions.
Ultimately, the Lean Advancement Initiative's Enterprise Transformation Roadmap was used as a method for depicting how performance measurement can help enable enterprise transformation. The implications of research in metrics for enterprise transformation span three areas: (1) the extensive literature reviews provide an academic contribution for performing enterprise and measurement research; (2) a common language and framework for exploring measurement problems is depicted for practitioners through the case study analysis; and (3) a connection between enterprise measurement and enterprise transformation is established to drive future transformation success.
by Craig D. Blackburn.
S.M.
S.M. in Technology and Policy
Russell, Keith A. (Keith Anthony) 1966. "Reengineering metrics systems for aircraft sustainment teams : a metrics thermostat for use in strategic priority management." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/29212.
Includes bibliographical references (p. 133-134).
We explore the selection of metrics for the United States Air Force weapon system sustainment team empirically, with emphasis placed on the incentive, structural and predictive implications of metrics. We define the term "metric" to include measures that employees impact through their efforts. We believe that even in a not-for-profit organization such as the Air Force, by putting emphasis (or weight) on a performance metric, the organization establishes inherent incentive structures within which employees will act to maximize their best interests. However, we believe that not-for-profit organizations differ from for-profit ones in their inherent structure, since profit becomes cost and several mission-oriented outcome variables share a fundamental importance in achieving the organization's goals. We seek an understanding of the structural composition of Air Force sustainment's metrics systems that, when coupled with a method for practical selection of a high-quality set of metrics (and weights), will align the incentives of employees with the interests of the organization. The empirical study is grounded in emerging theoretical work, which uses our above definition of a metric to propose a theoretical metrics feedback construct called the Metrics Thermostat. System structure is explored through common correlation and regression analysis as well as more sophisticated structural equation modeling and systems dynamics techniques used to explore potential feedback loops. The F-16 is used as a case study for this problem, and the metrics systems are considered from the front-line base-level point of view of Air Force active duty, Air National Guard and Air Force Reserve bases worldwide. 96 low-level metrics, covariates and outcomes were examined for 45 F-16 bases for a period of five years. Outcome importance was determined through personal interviews and internal archival documentation. Findings include:
-- The metrics, covariates and outcomes in the study are very interrelated.
-- The primary indicator of overall performance is Command (ACC, USAFE, etc.).
-- Increased Fix Rate increases Utilization, but increased Utilization decreases Fix Rate.
-- Cannibalization Rate is associated with higher Fix Rates but lower Mission Capability, Flying Scheduling Effectiveness, and Aircraft Utilization.
-- Active duty Mission Capability is predicted well from the dataset such that:
* Active duty commands have higher mission capability.
* Mission Capability is slightly higher in cool moist climates.
* Increased Aircraft Utilization, Repeat Discrepancies and Flying Scheduling Effectiveness are all associated with higher Mission Capability.
* Increased Break Rates and Unscheduled (engine) Maintenance are associated with lower Mission Capability.
The model appears to be valid for peacetime actions only.
by Keith A. Russell.
S.M.
Sides, Steve P. (Steve Paul) 1963. "Driving robust jet engine design through metrics." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/88331.
Day, Henry Jesse II. "An Investigation of Software Metrics Affect on Cobol Program Reliability." Diss., Virginia Tech, 1996. http://hdl.handle.net/10919/30479.
Ph. D.
Books on the topic "Metrics Program"
Grady, Robert B. Software metrics: Establishing a company-wide program. Englewood Cliffs, N.J: Prentice-Hall, 1987.
IT security governance guidebook with security program metrics on CD-ROM. Boca Raton, FL: Auerbach Publications/Taylor & Francis, 2007.
Improving metrics for the Department of Defense Cooperative Threat Reduction Program. Washington, D.C: National Academies Press, 2012.
New tools and metrics for evaluating Army distributed learning. Santa Monica, CA: RAND, 2011.
Janicak, Christopher A. Safety metrics: Tools and techniques for measuring safety performance. 2nd ed. Lanham: Government Institutes, 2009.
Romero, James S. Software metrics: A case analysis of the U.S. Army Bradley Fighting Vehicle A3 Program. Monterey, Calif: Naval Postgraduate School, 1998.
North Carolina. General Assembly. Program Evaluation Division. The UNC system needs a more comprehensive approach and metrics for operational efficiency: Final report to the Joint Legislative Program Evaluation Oversight Committee. [Raleigh, North Carolina]: Program Evaluation Division, 2013.
Adamov, Rade. Literature review on software metrics. [Hallbergmoos, Germany]: Angewandte InformationsTechnik, 1989.
Adamov, Rade. Literature review on software metrics. Zürich: Institut für Informatik der Universität Zürich, 1987.
Palmer, A. Jefferson. Foresters' metric conversions program (version 1.0). Radnor, PA: USDA Forest Service, 1999.
Book chapters on the topic "Metrics Program"
Tierney, Nathan William. "Program Core Metrics." In Value Management in Healthcare, 249–59. Boca Raton: Productivity Press, 2017. http://dx.doi.org/10.1201/9781315102245-9.
Martins, Pedro, Paulo Lopes, João P. Fernandes, João Saraiva, and João M. P. Cardoso. "Program and Aspect Metrics for MATLAB." In Computational Science and Its Applications – ICCSA 2012, 217–33. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-31128-4_16.
Vogelsang, Andreas, Ansgar Fehnker, Ralf Huuck, and Wolfgang Reif. "Software Metrics in Static Program Analysis." In Formal Methods and Software Engineering, 485–500. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-16901-4_32.
Jabbar, Abdul, and S. Sarala. "Authenticate Program Complexity Metrics Using RAA." In Communications in Computer and Information Science, 358–62. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-25734-6_54.
LeBlanc, Vicki. "Evaluation, Metrics, and Measuring ROI/VOI." In Comprehensive Healthcare Simulation: Program & Center Development, 123–29. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46812-5_17.
Gaikovina Kula, Raula, Kyohei Fushida, Shinji Kawaguchi, and Hajimu Iida. "Analysis of Bug Fixing Processes Using Program Slicing Metrics." In Product-Focused Software Process Improvement, 32–46. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. http://dx.doi.org/10.1007/978-3-642-13792-1_5.
Bodei, Chiara, Pierpaolo Degano, Gian-Luigi Ferrari, and Letterio Galletta. "Security Metrics at Work on the Things in IoT Systems." In From Lambda Calculus to Cybersecurity Through Program Analysis, 233–55. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-41103-9_9.
O’Connor, Peter. "Loyalty Programs and Direct Website Performance: An Empirical Analysis of Global Hotel Brands." In Information and Communication Technologies in Tourism 2021, 150–61. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-65785-7_13.
Prakash, Jyoti, Abhishek Tiwari, and Christian Hammer. "Effects of Program Representation on Pointer Analyses — An Empirical Study." In Fundamental Approaches to Software Engineering, 240–61. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-71500-7_12.
Chamillard, A. T. "An exploratory study of program metrics as predictors of reachability analysis performance." In Software Engineering — ESEC '95, 343–61. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/3-540-60406-5_24.
Conference papers on the topic "Metrics Program"
"Program Committee." In 11th IEEE International Software Metrics Symposium (METRICS'05). IEEE, 2005. http://dx.doi.org/10.1109/metrics.2005.39.
"Message from the General and Program Chairs." In 11th IEEE International Software Metrics Symposium (METRICS'05). IEEE, 2005. http://dx.doi.org/10.1109/metrics.2005.34.
Hamilton, William L. "Situation Awareness Metrics Program." In Aerospace Technology Conference and Exposition. 400 Commonwealth Drive, Warrendale, PA, United States: SAE International, 1987. http://dx.doi.org/10.4271/871767.
"Program committee." In 10th International Symposium on Software Metrics, 2004. Proceedings. IEEE, 2004. http://dx.doi.org/10.1109/metric.2004.1357883.
"Program Committee." In 2011 Third International Workshop on Security Measurements and Metrics (Metrisec). IEEE, 2011. http://dx.doi.org/10.1109/metrisec.2011.6.
Pan, Kai, Sunghun Kim, and E. Whitehead, Jr. "Bug Classification Using Program Slicing Metrics." In 2006 Sixth IEEE International Workshop on Source Code Analysis and Manipulation. IEEE, 2006. http://dx.doi.org/10.1109/scam.2006.6.
Krstic, Marko, and Milan Bjelica. "Performance metrics for personalized program guides." In 2016 13th Symposium on Neural Networks and Applications (NEUREL). IEEE, 2016. http://dx.doi.org/10.1109/neurel.2016.7800131.
Katzmarski, Bernhard, and Rainer Koschke. "Program complexity metrics and programmer opinions." In 2012 IEEE 20th International Conference on Program Comprehension (ICPC). IEEE, 2012. http://dx.doi.org/10.1109/icpc.2012.6240486.
Harth, Eric, and Philippe Dugerdil. "Document Retrieval Metrics for Program Understanding." In FIRE '15: Forum for Information Retrieval Evaluation. New York, NY, USA: ACM, 2015. http://dx.doi.org/10.1145/2838706.2838710.
Ross, Christopher P. "Predicting production metrics for unconventional shale reservoirs." In SEG Technical Program Expanded Abstracts 2019. Society of Exploration Geophysicists, 2019. http://dx.doi.org/10.1190/segam2019-3210608.1.
Reports on the topic "Metrics Program"
Pacheco, Patricia Marie, Ashley Endo, Lynne Schleiffarth Burks, Brandon Walter Heimer, Charles Joseph John, Trisha Hoette Miller, and Nerayo P. Teclemariam. National Hurricane Program Metrics Framework. Office of Scientific and Technical Information (OSTI), August 2016. http://dx.doi.org/10.2172/1561817.
SECRETARY OF THE AIR FORCE WASHINGTON DC. Information Protection Metrics and Measurements Program. Fort Belvoir, VA: Defense Technical Information Center, August 1997. http://dx.doi.org/10.21236/ada404998.
Yule, H. P., and C. A. Riemer. Program for implementing software quality metrics. Office of Scientific and Technical Information (OSTI), April 1992. http://dx.doi.org/10.2172/10166209.
Yule, H. P., and C. A. Riemer. Program for implementing software quality metrics. Office of Scientific and Technical Information (OSTI), April 1992. http://dx.doi.org/10.2172/7233625.
Craig, Philip A., J. Mortensen, and Jeffery E. Dagle. Metrics for the National SCADA Test Bed Program. Office of Scientific and Technical Information (OSTI), December 2008. http://dx.doi.org/10.2172/963242.
Cianciolo, Anna T. Program Evaluation Metrics for U.S. Army Lifelong Learning Centers. Fort Belvoir, VA: Defense Technical Information Center, March 2007. http://dx.doi.org/10.21236/ada465470.
Program analysis methodology Office of Transportation Technologies: Quality Metrics final report. Office of Scientific and Technical Information (OSTI), March 2002. http://dx.doi.org/10.2172/1216587.
Rice, Chris, and Robin Locksley. Establishing a Program for Applying Earned Value Metrics to Flight Test. Fort Belvoir, VA: Defense Technical Information Center, January 2000. http://dx.doi.org/10.21236/ada375749.
Patterson, P., J. Moore, M. Singh, and E. Steiner. Program analysis methodology Office of Transportation Technologies 2003 quality metrics final report. Office of Scientific and Technical Information (OSTI), September 2002. http://dx.doi.org/10.2172/801658.
Dickerson, L. S. DOE Safety Metrics Indicator Program (SMIP) Third Quarter FY 2001 Quarterly Report. Office of Scientific and Technical Information (OSTI), September 2001. http://dx.doi.org/10.2172/814292.