PUBH8545 Walden: Analyzing Changing Data and Definitions


Data collection and dissemination may present challenges to those working in the field. However, knowledge derived from its analysis and interpretation provides the basis for crucial management-level functions, such as planning and program evaluation, policy development and advancement, and the overall advancement of the public health agenda at every level.

For this Assignment, you will analyze how public health data and definitions have changed over the past decade, and you will evaluate the consistency and reliability of these data. Review the article "Why Do Americans Have Shorter Life Expectancy and Worse Health Than Do People in Other High-Income Countries?" by Mauricio Avendano and Ichiro Kawachi (2014). Then write a 2- to 3-page analysis and critique of the paper (not including title page, references, or tables). Include the following:
  • What data sources were used in this study?
  • Did the authors use secondary data or collect primary data?
  • What variables (dependent, independent, and covariates) were included?
  • What specific trend is being examined?
  • Did the authors find any specific trend? If yes, what can you say about the future of this condition based on their trend analysis?
  • Define validity and reliability, and explain whether you have any concerns about the reliability and validity of the data used in this study.

References
Avendano, M., & Kawachi, I. (2014). Why do Americans have shorter life expectancy and worse health than do people in other high-income countries?
Shi, L., & Johnson, J. A. (Eds.). (2014). Novick & Morrow's public health administration: Principles for population-based management (3rd ed.). Burlington, MA: Jones & Bartlett. Chapter 15, "Public Health Surveillance."

Unformatted Attachment Preview


Combining Surveillance Systems: Effective Merging of
U.S. Veteran and Military Health Data
Julie A. Pavlin1*, Howard S. Burkom2, Yevgeniy Elbert2, Cynthia Lucero-Obusan3, Carla A. Winston3,
Kenneth L. Cox4, Gina Oda3, Joseph S. Lombardo2, Mark Holodniy3
1 Armed Forces Health Surveillance Center, US Department of Defense, Silver Spring, Maryland, United States of America, 2 Applied Physics Laboratory,
Johns Hopkins University, Laurel, Maryland, United States of America, 3 Office of Public Health, US Department of Veterans Affairs, Washington, District of
Columbia, United States of America, 4 US Army Institute of Public Health, US Department of Defense, Aberdeen Proving Ground, Maryland, United States of America
Background: The U.S. Department of Veterans Affairs (VA) and Department of Defense (DoD) had more than 18
million healthcare beneficiaries in 2011. Both Departments conduct individual surveillance for disease events and
health threats.
Methods: We performed joint and separate analyses of VA and DoD outpatient visit data from October 2006 through
September 2010 to demonstrate geographic and demographic coverage, timeliness of influenza epidemic
awareness, and impact on spatial cluster detection achieved from a joint VA and DoD biosurveillance platform.
Results: Although VA coverage is greater, DoD visit volume is comparable or greater. Detection of outbreaks was
better in DoD data for 58% and 75% of geographic areas surveyed for seasonal and pandemic influenza,
respectively, and better in VA data for 34% and 15%. The VA system tended to alert earlier with a typical H3N2
seasonal influenza affecting older patients, and the DoD performed better during the H1N1 pandemic which affected
younger patients more than normal influenza seasons. Retrospective analysis of known outbreaks demonstrated
clustering evidence found in separate DoD and VA runs, which persisted with combined data sets.
Conclusion: The analyses demonstrate two complementary surveillance systems with evident benefits for the
national health picture. Relative timeliness of reporting could be improved in 92% of geographic areas with access to
both systems, and more information provided in areas where only one type of facility exists. Combining DoD and VA
data enhances geographic cluster detection capability without loss of sensitivity to events isolated in either population
and has a manageable effect on customary alert rates.
Citation: Pavlin JA, Burkom HS, Elbert Y, Lucero-Obusan C, Winston CA, et al. (2013) Combining Surveillance Systems: Effective Merging of U.S. Veteran and Military Health Data. PLoS ONE 8(12): e84077. doi:10.1371/journal.pone.0084077
Editor: Martyn Kirk, The Australian National University, Australia
Received June 14, 2013; Accepted November 20, 2013; Published December 26, 2013
This is an open-access article, free of all copyright, and may be freely reproduced, distributed, transmitted, modified, built upon, or otherwise used by
anyone for any lawful purpose. The work is made available under the Creative Commons CC0 public domain dedication.
Funding: This work was financially supported by intramural funding from the Departments of Veterans Affairs and Defense through the Joint Incentive
Fund. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.
Introduction

The availability of accurate and timely information is a critical element needed for an effective response to disease threats, both naturally occurring events such as influenza epidemics and potential terrorist attacks involving biological agents, such as occurred in the United States with the anthrax mailings [1]. Protecting the health and safety of the public requires a well-integrated national biosurveillance enterprise. Among other data sources, biosurveillance includes the collection and analyses of human health indicators such as healthcare utilization. Effective biosurveillance systems leverage robust data collection and analyses performed at the local level, while incorporating a national perspective to provide a broad picture across regions or jurisdictions.

In 2011, the Department of Veterans Affairs (VA) system had 8.6 million enrolled Veteran patients and 79.8 million outpatient visits [2], and the Department of Defense (DoD) Military Health System had 9.7 million eligible beneficiaries and 38 million outpatient visits [3]. The Departments are among the largest healthcare systems providing care in the United States. The Departments use distinct versions of a system known as the Electronic Surveillance System for the Early Notification of Community-Based Epidemics (ESSENCE) and also report surveillance data to BioSense, a national biosurveillance platform provided by the Centers for Disease Control and Prevention (CDC) and administered by the Association of State and Territorial Health Officers [4]. However, DoD and VA data are evaluated separately for threats to health, and data suitable for population-based health surveillance are not routinely shared between these agencies.
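As a rough check on the scale of the two systems, the 2011 figures above imply very different per-person utilization. The following is a back-of-envelope calculation, not a figure reported by either Department, and it ignores that "enrollee" and "eligible beneficiary" are defined differently:

```python
# 2011 figures cited above (millions)
va_enrollees, va_visits = 8.6, 79.8
dod_beneficiaries, dod_visits = 9.7, 38.0

# Outpatient visits per covered person per year (illustrative ratio only)
print(round(va_visits / va_enrollees, 1))        # → 9.3
print(round(dod_visits / dod_beneficiaries, 1))  # → 3.9
```

The roughly 9-to-4 utilization gap is consistent with the age-distribution differences the paper reports later.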
Recently, a retrospective analysis combining data from VA
and DoD facilities in North Chicago, IL, was used to examine
seasonal influenza trends, a gastrointestinal (GI) illness
epidemic and a heat-related event. This provided the first
example of the possible benefits of performing joint VA and
DoD biosurveillance [5]. Then, in July of 2012, the White House
released the first-ever National Strategy for Biosurveillance
where President Barack Obama called for “a coordinated
approach that brings together Federal, State, local, and tribal
governments…” [6]. The strategy proposes “to integrate and
enhance national biosurveillance efforts,” and outlines four
focus areas as enablers for strengthening biosurveillance.
These include: integrate biosurveillance capabilities, such as
through information sharing arrangements and across
traditional organizational lines; build capacity across our
distributed national biosurveillance architectures; foster
innovation such as combining information to project what is
likely to transpire; and strengthen partnerships through the
sharing of information between and among Federal, State,
local, tribal, territorial, private, nongovernmental, academic and
other participants [6].
Based on this recognized need, we submitted a proposal to
the Joint Incentive Fund, a program designed to facilitate the
coordination, use, or exchange of health care resources, with
the goal of improving access to, and quality and cost
effectiveness of, the health care provided to beneficiaries of
both the DoD and VA [7]. Our aim was to determine if we could
improve early detection and situational awareness of health
events of regional and national significance through a
consolidated system that maximizes both the sample size and
diversity of the populations monitored and reflects the spirit and
goals of improving integration and coordination as outlined in
the National Biosurveillance Strategy. Here we describe the
geographic coverage and outpatient visit characteristics of the
two populations, the relative timeliness in detection of the two
systems for seasonal influenza epidemics by region for both
separate and combined data, and the effect of combining
systems on spatial cluster detection. We demonstrate the utility
of a joint DoD and VA public health surveillance system that
can inform not only the two Departments, but the nation’s
public health awareness.
Materials and Methods

Electronic International Classification of Diseases, 9th Revision (ICD-9) codes from outpatient medical visits from all DoD and VA facilities for October 1, 2006, to September 30, 2010 (Fiscal Years 2007-2010) were available for analysis from the Departments' electronic health records. The dataset included over 137.7 million DoD visits from 131 hospitals and ambulatory care centers, which include 362 separate clinics, and over 253 million VA visits from 128 hospitals and 786 clinics.

Coverage determination

We used the US Office of Management and Budget's core-based statistical area (CBSA) to group healthcare data from the respective VA and DoD systems by megapolitan (>1 million population), metropolitan (50,000-1 million population) and micropolitan (10,000-50,000 population) areas. We performed frequency analyses and mapped coverage of the VA and DoD medical systems in these CBSAs by number of hospitals, number of clinics, counts of all visits, and counts by age for 0-17, 18-44, 45-64 and over 65 years.

Comparability of systems for temporal detection

To assess temporal detection patterns in VA and DoD data, we retrospectively evaluated two influenza epidemics that were expected to affect covered populations in many CBSAs. The first event was seasonal influenza dominated by an H3N2 strain that began in December 2007 and continued until early April 2008. This event was more severe than other seasonal outbreaks since 2005, especially among the elderly. The second event was the fall wave of the novel H1N1 influenza pandemic in 2009. The fall wave was classified similarly as moderately severe by the CDC [8]; however, the burden of morbidity was unusually high among children and young adults.

We included the following clinic types for analysis: internal medicine, pediatrics, adolescent medicine, family practice, primary care, urgent care, geriatric clinics, women's primary care, infectious diseases and emergency medicine. We grouped visits into an influenza-like-illness (ILI) syndrome category if the assigned ICD-9 code(s) for the visit matched a defined ICD-9 code for the ILI syndrome group. The ICD-9 codes used to determine inclusion in the ILI syndrome group in ESSENCE were originally selected from a study correlating ICD-9 codes with positive influenza laboratory tests [10]. To choose an ICD-9 set for this current analysis, we analyzed code frequencies in each Department's system to determine relative frequency and comparability and chose a common ILI classification (Table S1 in File S1).

We formed weekly CBSA counts by summing the ILI-related visit counts from all facilities within the CBSA. We applied ESSENCE alerting algorithms [11] to weekly CBSA-level outpatient data and analyzed the two data streams (DoD and VA) separately. We expressed timeliness in weeks rather than days to reduce the effects of reporting or system acquisition delays. We applied the default ESSENCE alerting algorithm [11] to all time series of weekly ILI counts for all CBSAs with both VA and DoD healthcare facilities for the four years. For meaningful timeliness comparison, we restricted the analysis to the 93 CBSAs with at least two ILI-related visits per week in both DoD and VA datasets, primarily megapolitan and metropolitan areas. We chose November 1, 2007, to March 31, 2008 (2007-2008 seasonal influenza), and September 1, 2009, to December 31, 2009 (2009 novel H1N1 pandemic fall wave) as representative moderately severe influenza outbreaks [8]. In brief, for derived time series with systematic or cyclic behavior, the ESSENCE algorithm defaults to regression models including seasonal and day-of-week effects to calculate expected counts, and produces alerts when visit counts are statistically higher than expected. For sparser time series, an adaptive control chart is the default. For both influenza epidemics, we compared the date of the first alert occurring at the p<0.01 statistical significance level and the number of significant alerts for each CBSA in each of the DoD and VA outpatient datasets.

Comparability of systems for cluster detection

To analyze VA and DoD surveillance data for potential disease outbreaks in space and time, we calculated spatial scan statistics using software introduced in [12] and identified statistically significant clusters of events. Outpatient visit records from both systems contain a patient zip code field that is completed in over 94% of records. Using records grouped by the patient zip code field, we estimated visit distributions for both DoD and VA datasets and applied a published spatial scan statistics implementation [10] to the separate and combined datasets.

Input data files in the combined and separate VA and DoD analyses were matrices of daily ILI or gastrointestinal (GI) syndrome visit counts with a common ICD-9 code grouping chosen for both VA and DoD systems (Tables S1-S2 in File S1). GI syndrome codes were standard codes historically used by DoD and VA ESSENCE. Table S2 in File S1 provides the GI codes and indicates the top 10 that we used in the medical records in these analyses. Matrix rows were consecutive days, columns were patient residence zip codes, and entry (i,j) was the number of visits on day i from zip code j. We made these files for DoD data, VA data, and combined data.

To assess the number of significant clusters determined from combining datasets, we used data from three separate regions: Baltimore/Washington, DC (dominated by DoD data), Los Angeles, CA (mainly VA data) and Tampa, FL (both represented). For each region, we ran sets of 1,672 consecutive single-day analyses for cluster determination trials from October 1, 2006 through September 30, 2010 for ILI and GI syndrome data. To examine background alerting rates in these sets, we tabulated counts of clusters with p-values no larger than 0.001 using the standard rank-based p-value introduced in [13] for spatial scan statistics.

In addition to examining the alert rate, we performed focused runs to detect known events without advance knowledge regarding what clustering to expect for outbreaks in New York (GI, January-March 2010), California (ILI, December 2007-April 2008 and September-December 2009), and New Jersey (GI, January-March 2010). We did not receive prior reports of any spatial clustering in these outbreaks.

Results

Coverage determination

We identified a total of 939 CBSAs, with generally diffuse geographic coverage by VA facilities and a higher concentration in larger metro and mega areas for DoD facilities (Figure 1). Of the 51 mega CBSAs, all had at least one VA facility and 63% had a DoD facility (Table S3 in File S1). Coverage was sparser for the metro CBSAs and lighter still for the micro CBSAs. While the VA coverage was greater in terms of total number of visits for all CBSA levels (253 million vs. 137 million), the DoD visit volume exceeded the VA visit volume in 67 of the 132 CBSAs covered by facilities in both systems (Table S3 in File S1). Patient age distribution differed sharply, with >85% of the VA patients being over 45 years of age compared to 22% of DoD patients. For all CBSAs, the overall VA/DoD visit ratio was 1.92, but the ratio for 0-17 years was 0.004, for 18-44 years 0.33, for 45-64 years 5.20 and for >65 years 11.63.

Comparability of systems for temporal detection

Tables 1 and 2 summarize the timeliness comparison for ILI in alert weeks. Based on the curve for timeliness of detection (Figure 2), we removed CBSAs from the analysis if they were outliers with more than eight weeks between the DoD and the VA alerting. Therefore, we report 77 and 79 of the CBSAs for the 2007-2008 seasonal influenza outbreak and the 2009 pandemic, respectively. We detected seasonal 2007-2008 influenza at significant alert levels in at least one of the systems in all CBSAs, and detected the fall 2009 pandemic influenza wave in 76 of the 79 included in the analysis (96.2%). Detection occurred in the same week for both VA and DoD in only six CBSAs (7.8%) for the seasonal event and in only five CBSAs (6.3%) for the pandemic (Figure 2). When we found a timeliness difference, detection in the DoD data tended to occur earlier, but not uniformly so (Table 1). Considering the 2007-2008 season for all analyzed CBSAs, detection occurred earlier or only in DoD data for 45 CBSAs (63.4%) and in VA data for 26 CBSAs (36.6%, significantly different at the 95% confidence level). For the pandemic influenza year that affected primarily younger patients, DoD data detected earlier or only for 59 CBSAs (74.7%), but 12 CBSAs (15.2%) still detected earlier or only in VA data, and for this event the earlier DoD detections occurred in all CBSA sizes. Restriction to CBSAs with higher patient loads (more than 10 ILI visits/week) gave similar results (Table 2). Overall, DoD tended to alert earlier across all influenza seasons examined; however, VA data alerted earlier in a greater number of CBSAs during the typical H3N2 seasonal flu than during the 2009 H1N1 influenza pandemic.

To describe delay lengths, defined as the alerting timeliness difference between DoD and VA data, Figure 2 presents the distribution of the week alerted in DoD data minus the week alerted in VA data for the two selected outbreaks. For the H3N2 seasonal influenza, eight CBSAs alerted four weeks earlier in DoD data than in VA data, but for the H1N1 fall wave, only three CBSAs alerted with the same four-week advantage to the DoD data. Conversely, seven CBSAs alerted three weeks earlier in VA data for the H3N2 epidemic but only two such CBSAs for the H1N1 pandemic. The fact that both bar plots are skewed to the left shows a general timeliness advantage for the DoD data. The delays for VA data were one to four weeks for most CBSAs, with some longer delays. Again, the timeliness advantage was not uniform; for the 2007-2008 influenza season, a one- to four-week timeliness advantage occurred in the VA for a total of 19 CBSAs. For the 2009 pandemic influenza, whose burden was mainly among young adults, the delays are more clearly skewed towards earlier alerting within the DoD system.
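The week-level alerting compared above can be illustrated with a simplified stand-in for the control-chart branch that ESSENCE applies to sparse series. The function name, baseline length, and threshold below are illustrative choices for this sketch, not ESSENCE's actual parameters:

```python
from statistics import mean, stdev

def alert_weeks(counts, baseline=8, z=3.0):
    """Flag weeks whose ILI count exceeds a moving-baseline threshold.

    A simplified stand-in for an adaptive control chart: each week is
    compared against the mean of the preceding `baseline` weeks plus
    `z` standard deviations. The standard deviation is floored at 1.0
    so that flat baselines do not produce hair-trigger alerts.
    """
    alerts = []
    for i in range(baseline, len(counts)):
        ref = counts[i - baseline:i]
        mu, sd = mean(ref), stdev(ref)
        if counts[i] > mu + z * max(sd, 1.0):
            alerts.append(i)
    return alerts

# Synthetic weekly ILI counts for one CBSA, with an epidemic rise
weekly_ili = [3, 4, 2, 5, 3, 4, 3, 4, 4, 5, 18, 25, 30, 12, 6]
print(alert_weeks(weekly_ili))  # → [10, 11]
```

Note how the epidemic's own early weeks inflate the baseline, so later high weeks stop alerting; this self-adjustment is why first-alert week, rather than total alert count, is the timeliness measure compared between the two systems.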
Figure 1. Department of Defense and Veterans Affairs medical facilities in the continental United States. Locations of VA
hospitals (large green circles) and clinics (small green circles) and DoD hospitals (large blue triangles) and clinics (small blue
triangles) in the continental US.
doi: 10.1371/journal.pone.0084077.g001
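The input matrices described for the cluster analyses (rows = consecutive days, columns = patient residence zip codes, entry (i,j) = visits on day i from zip code j) can be assembled as in the following sketch; the record format here is illustrative, not the Departments' actual schema:

```python
def build_count_matrix(visits, n_days, zips):
    """Build the day-by-zip visit-count matrix used as spatial-scan input.

    `visits` is an iterable of (day_index, zip_code) pairs; `zips` fixes
    the column order so DoD, VA, and combined matrices align.
    """
    counts = {d: {z: 0 for z in zips} for d in range(n_days)}
    for day, zc in visits:
        counts[day][zc] += 1
    return [[counts[d][z] for z in zips] for d in range(n_days)]

zips = ["20910", "20723"]  # hypothetical patient zip codes
visits = [(0, "20910"), (0, "20910"), (1, "20723")]
print(build_count_matrix(visits, 2, zips))  # → [[2, 0], [0, 1]]
```

Because the columns are keyed by zip code rather than by facility or Department, a combined DoD+VA matrix is simply the element-wise sum of the two separate matrices.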
Table 1. Alerting timeliness comparison of Department of Defense (DoD) and Veterans Affairs (VA) outpatient records associated with influenza-like illness (ILI) during two different influenza events and both events combined, for Core-Based Statistical Areas (CBSAs) that have both DoD and VA facilities and at least 2 ILI patients/week, stratified by alerting timeliness (weeks).
Columns: Seasonal Influenza 2007-08 H3N2 | Pandemic 2009 Fall H1N1 | Both Influenza Events
Rows: Alert Same Week; DoD Alerts Earlier; VA Alerts Earlier; Only DoD Alerts; Only VA Alerts; Neither Alerts
(Cell values were not preserved in this extraction.)
doi: 10.1371/journal.pone.0084077.t001
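The row categories in Tables 1 and 2 partition CBSAs by first-alert week in each system; the assignment logic can be stated directly (a sketch, with the convention that a missing alert is represented as None):

```python
def alert_category(dod_week, va_week):
    """Assign a CBSA to one of the six alerting categories of Tables 1-2.

    `dod_week` and `va_week` are first-alert week numbers, or None if
    no significant alert occurred in that system during the event.
    """
    if dod_week is None and va_week is None:
        return "Neither Alerts"
    if va_week is None:
        return "Only DoD Alerts"
    if dod_week is None:
        return "Only VA Alerts"
    if dod_week == va_week:
        return "Alert Same Week"
    return "DoD Alerts Earlier" if dod_week < va_week else "VA Alerts Earlier"

print(alert_category(3, 5))  # → DoD Alerts Earlier
```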
Table 2. Alerting timeliness comparison of Department of Defense (DoD) and Veterans Affairs (VA) outpatient records associated with influenza-like illness (ILI) during two different influenza events and both events combined, for Core-Based Statistical Areas (CBSAs) that have both DoD and VA facilities and at least 10 ILI patients/week, stratified by alerting timeliness (weeks).
Columns: Seasonal Influenza 2007-08 H3N2 | Pandemic 2009 Fall H1N1 | Both Influenza Events
Rows: Alert Same Week; DoD Alerts Earlier; VA Alerts Earlier; Only DoD Alerts; Only VA Alerts; Neither Alerts
(Cell values were not preserved in this extraction.)
doi: 10.1371/journal.pone.0084077.t002
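The delay-length distribution summarized in Figure 2 (week alerted in DoD data minus week alerted in VA data, per CBSA) can be tabulated as follows; the example inputs are invented for illustration:

```python
from collections import Counter

def delay_distribution(dod_weeks, va_weeks):
    """Distribution of (DoD alert week - VA alert week) across CBSAs.

    Negative values mean DoD alerted earlier. CBSAs where either system
    never alerted (None) are excluded, matching a paired comparison.
    """
    diffs = [d - v for d, v in zip(dod_weeks, va_weeks)
             if d is not None and v is not None]
    return Counter(diffs)

dod = [3, 5, 4, None, 6]  # hypothetical first-alert weeks per CBSA
va  = [5, 5, 7, 4, 5]
print(delay_distribution(dod, va))  # DoD earlier in two CBSAs, tied once, VA earlier once
```

A left-skewed histogram of these differences corresponds to the overall DoD timeliness advantage the paper reports.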
