
STATE OF MAINE
DEPARTMENT OF PROFESSIONAL AND FINANCIAL REGULATION
BUREAU OF INSURANCE

IN RE: REVIEW OF AGGREGATE MEASURABLE COST SAVINGS DETERMINED BY DIRIGO HEALTH FOR THE FOURTH ASSESSMENT YEAR

Docket No. INS-08-900


PART II:

DECISION AND ORDER

 

Superintendent of Insurance Mila Kofman issues the following Decision and Order in the above-captioned proceeding.

I. BACKGROUND

The adjudicatory proceeding in this matter was conducted pursuant to 24‑A M.R.S.A. § 6913(1)(C); the Maine Administrative Procedure Act, 5 M.R.S.A. chapter 375, subchapter 4; 24-A M.R.S.A. §§ 229 to 236; Bureau of Insurance Rule Chapter 350; and orders issued by me.

On August 12, 2008, pursuant to 24-A M.R.S.A. § 6913(1)(B), the Board of Directors (the “Board”) of the Dirigo Health Agency (“DHA”)1 filed its annual determination of:

the aggregate measurable cost savings, including any reduction or avoidance of bad debt and charity care costs to health care providers in this State as a result of the operation of Dirigo Health and any increased MaineCare enrollment due to an expansion in MaineCare eligibility occurring after June 30, 2004.

24-A M.R.S.A. § 6913(1)(A).

The purpose of this proceeding and hearing is for me to review the Board’s filing and “issue an order approving, in whole or in part, or disapproving the filing.”  24‑A M.R.S.A. § 6913(1)(C).  I am required to “approve the filing upon a determination that the aggregate measurable cost savings filed by the board are reasonably supported by the evidence in the record.”  Id.  The Board, through DHA as the moving party, has the burden of proving that its determination of aggregate measurable cost savings is reasonably supported by the evidence in the record.

“Reasonably supported by the evidence” has been previously interpreted to refer to the totality of the evidence and not to any part of the evidence taken out of context.  In re Review of Aggregate Measurable Cost Savings Determined by Dirigo Health for the First Assessment Year (“Year One Decision”), No. INS-05-700 at p. 2 (October 29, 2005).  Furthermore, it has been stated that “reasonably supported” is not equivalent to a preponderance-of-the-evidence standard.  Id.  The Board does not have to prove that its chosen alternative is the best or only alternative supported by the record, nor that it is the most reasonable; rather, the Board must show that the evidence in the record reasonably supports its alternative.  Id.
           
II. PARTIES

The Dirigo Health Agency, through its Board of Directors, is a party to the proceeding.  24-A M.R.S.A. § 6913(1)(C).  Other parties to the proceeding, pursuant to my grants of intervention, include the Maine Automobile Dealers Association Insurance Trust (“Trust”), the Maine State Chamber of Commerce (“Chamber”), the Maine Association of Health Plans (“MEAHP”), Anthem Health Plans of Maine, Inc. (“Anthem”), and Consumers for Affordable Health Care (“CAHC”).

III. PROCEDURAL HISTORY

On July 10, 2008, a Notice of Pending Proceeding and Hearing was issued, setting, among other matters, the intervention deadline and contingent hearing dates.  The July 10th Notice also included initial procedures for the conduct of the proceeding.  By Order issued August 18, 2008, September 9, 2008 was established as the date for the public hearing.

On August 12, 2008, the Board, through its counsel, Assistant Attorney General William Laubenstein, submitted the Board’s filing.  The filing consists of the Board’s August 11, 2008, written decision and a copy of the complete administrative record of the proceeding before the Board, In re Determination of Aggregate Measurable Cost Savings for the Fourth Assessment Year (2009).  On August 29, 2008, the Board filed an amended page 4 of the Board’s decision.  This administrative record is voluminous and has been made available for public inspection at the offices of the Bureau of Insurance in Gardiner, Maine throughout this proceeding.  All other filings made by the parties and the Superintendent’s interlocutory rulings and orders have been posted throughout the proceeding to the Bureau’s web page at www.maine.gov/insurance for public access and inspection.

On August 15, 2008, the Trust filed a motion to recuse Superintendent Kofman from any further participation in this matter, which was denied by Order issued August 25, 2008.

Intervention was granted by Order issued August 18, 2008, to the Trust represented by Bruce Gerrity, Esq.; the Chamber represented by William Stiles, Esq.; MEAHP represented by D. Michael Frink, Esq.; Anthem represented by Christopher Roach, Esq.; and CAHC represented by Joseph Ditré, Esq.  DHA was represented by Assistant Attorney General Michael Colleran.  The August 18th Order also included further procedures for the conduct of the proceeding.
 
On August 26, 2008, all intervenor parties filed separate briefs.  DHA’s brief was filed on September 2, 2008.  All intervenor parties filed separate reply briefs on September 5, 2008. 

An Order Regarding the Record was issued on August 29, 2008 seeking additional information from DHA and the Chamber, to which DHA and the Chamber separately responded on September 2, 2008.

A Scheduling Order was issued on September 5, 2008 setting forth the procedure for oral argument at hearing and the order of issues to be addressed at the hearing.

The hearing was held in Augusta, Maine on September 9, 2008.  The hearing was conducted entirely in public session.  Counsel for each of the parties presented oral argument at the hearing.  At the conclusion of the hearing, written Hearing Questions for Citations to the Record were distributed, to which all parties filed separate responses on September 11, 2008.

On September 23, 2008, a Part I Decision and Order was issued setting forth the result of this proceeding with a summary of the Superintendent’s factual conclusions.  That preliminary decision and order was issued pursuant to Bureau of Insurance Rule 350, § 18, which authorizes the issuance of a two-part decision in “extraordinary circumstances, including those in which a time constraint imposed by rule or statute requires the issuance of a decision by a specific date.”  This Part II Decision and Order incorporates the complete statement of findings of fact supporting the decision and the detailed procedural history.

IV. DISCUSSION, ANALYSIS, FINDINGS, AND CONCLUSIONS

The Board’s filing itemizes aggregate measurable cost savings on the basis of four identified topics: three categories of savings initiatives and another category labeled “overlap” that is intended to account for savings that are double-counted between certain of the savings initiatives.  The table below identifies the four areas, the amount of savings and overlap approved by the Board as contained in its filing, and the amount of savings and overlap that I find to be reasonably supported by the evidence in the record.

 

SAVINGS INITIATIVES                                         AMOUNT BOARD APPROVED    AMOUNT FOUND REASONABLY SUPPORTED
Hospital Savings Initiatives                                $119.4 million           $40 million
Uninsured / Underinsured Savings Initiatives                $23.6 million            $6.1 million
Medical Loss Ratio                                          $6.6 million             $6.6 million
Overlap (resulting in a reduction in the savings amount)    $0                       ($4.0 million)
TOTAL                                                       $149.6 million           $48.7 million

A. Legal Issues

As explained in the Years One, Two, and Three Decisions of the Superintendent of Insurance, my statutory responsibility in this proceeding is limited to determining whether the “aggregate measurable cost savings filed by the board are reasonably supported by the evidence in the record.”  24‑A M.R.S.A. § 6913(1)(C).  In making this determination, my authority is limited to “issu[ing] an order approving, in whole or in part, or disapproving the filing.”  Id.  Thus, I do not sit as an appellate tribunal with the authority to review the Board’s interpretations of law.  Although the payor intervenors (MEAHP, the Trust, the Chamber, and Anthem) have argued for me to make certain legal interpretations regarding the Board’s actions – for example, whether the Board’s inclusion of the Medical Loss Ratio initiative in its cost savings determination is permissible under the Dirigo laws – these issues are beyond my statutory authority and beyond the scope of this proceeding.

As consistently explained by the Superintendent through the course of the previous three years’ annual review proceedings, I have not been granted the power by the Legislature to review the legal interpretations made by the Board.  The breadth of my legal authority in these proceedings is prescribed by statute.  My charge is to review the record to determine whether the evidence reasonably supports the aggregate measurable cost savings determined by the Board.  This limited statutory jurisdiction over the Board’s cost savings determinations does not invest me with the powers of the judicial branch, in this instance to rule on the legality of substantive decisions made by the Board, a separate executive agency, under its separate statutory responsibilities.  Thus, as in previous years’ proceedings, I confine my review to analyzing whether the amounts and methodologies used by the Board for determining aggregate measurable cost savings are reasonably supported by the evidence in the record.

When claims are raised that the Board exceeded its authority or erred as a matter of law in determining the meaning of the phrase “aggregate measurable cost savings” and in creating a methodology to implement that interpretation, the task of resolving those claims is for the courts.  See, generally, Maine Association of Health Plans v. Superintendent of Insurance, 2007 ME 69.

B. The Board’s Determination of Aggregate Measurable Cost Savings

To assist it in developing a methodology for calculating aggregate measurable cost savings, DHA retained the consulting firm of schramm-raleigh Health Strategy (“srHS”).  The recommendations by srHS were presented in a document entitled Report to the Dirigo Health Agency, Dirigo Health Act:  Aggregate Measurable Cost Savings (AMCS) for Year 4, Updated June 26, 2008 (the “srHS Report”).  The srHS Report determined the aggregate measurable cost savings to total $190.2 million.2 The Board adopted all of the srHS savings initiative categories and the finding that this year’s calculations required no overlap adjustment, but rejected aspects of the savings calculations from the hospital and uninsured / underinsured initiatives, thereby approving a savings amount of $149.6 million.

1. Hospital Savings Initiatives.  Board determination:  $119.4 million.  Amount the Superintendent finds reasonably supported by the evidence:  $40 million.

Background

The hospital savings initiative component of the Board’s filing seeks to measure savings resulting from the cost containment initiatives established in connection with the Dirigo program.  This requires a comparison of actual cost levels to an estimate of what would have occurred had the Dirigo program not been implemented – a concept referred to by researchers as “the counterfactual.”  Expressed in mathematical terms, Savings = Counterfactual – Actual, and the principal task at hand is to review the data on the record and the associated calculations of the counterfactual.  Such a task is inherently difficult and imprecise.  Perhaps for that reason, the standard of review the Superintendent is required by law to apply is whether the evidence on the record reasonably supports the Board’s findings.  By necessity, the Superintendent’s task in this proceeding becomes determining whether a reasonably supported counterfactual is provided in the information placed on the record.  Such a standard by its very nature implies that significant judgment is involved in making the determination of the reasonableness of the savings amount.  No single formula or model should be expected to produce an answer.  The totality of the evidence must be reviewed and a judgment made as to whether the recommended savings have reasonable support.
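
The savings identity stated above can be illustrated with a short arithmetic sketch.  This is a hypothetical illustration only: the function name and all dollar figures are invented and are not drawn from the record.

```python
# Illustrative sketch of Savings = Counterfactual - Actual.
# All figures are hypothetical, for arithmetic illustration only.

def estimated_savings(baseline_cost, counterfactual_growth, actual_cost):
    """Project the counterfactual cost from a baseline and a growth rate,
    then take the difference from the actual observed cost."""
    counterfactual_cost = baseline_cost * (1 + counterfactual_growth)
    return counterfactual_cost - actual_cost

# Hypothetical: $2,000M baseline cost, 8% expected growth absent the
# program, $2,100M actual cost observed.
savings = estimated_savings(2000.0, 0.08, 2100.0)
print(round(savings, 1))  # 60.0 (million): costs rose, but less than projected
```

The sketch shows why the entire difficulty lies in the counterfactual growth rate: the baseline and actual costs are observed, while the growth rate absent the program must be estimated.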

In the proceeding, the inherent difficulty of the task has been exacerbated by incomplete documentation of the data, methods, interpretations, and assumptions made by DHA and its consultant.  The payor intervenors argue that the Board exceeded the discretion granted to it by statute by relying heavily on recommendations from DHA and its consultant that were based on poor practices and highly questionable methodology.  The indiscriminate nature of the payor intervenors’ criticisms, however, has only served to muddy the waters further, as many of their charges have been made without substantiation for their materiality or their applicability to the measurement task at hand, disregarding readily available information that would allow them to assess the degree to which specific criticisms, if true, would actually affect the estimated savings.  Furthermore, many of the attacks leveled at DHA and its Board by the payor intervenors are, in effect, criticisms of the law, which both the Board and the Superintendent of Insurance are obligated to follow.

As a result of the difficult statutory task and the manner in which it has been conducted by the parties to the proceeding, the record is composed of a myriad of dramatically varying assertions, in some cases unsupported, in others supported with difficult-to-interpret technical details.  Fortunately, a significant amount of the data required to assess the level of savings is contained in the record, and the Superintendent and her staff and consultants have been able to work through this information to make judgments about the degree to which the recommended savings find reasonable support.

The “CMAD” Approach

In the savings offset payment (“SOP”) proceedings, the estimates of the savings related to voluntary cost controls by hospitals have centered on the concept of a cost or expense per case-mix adjusted discharge.  The numerator of this measure is an adjusted version of total hospital expense, and the denominator is an estimate of hospital services provided, expressed in patient discharge equivalents.  That denominator is called “case-mix adjusted discharges,” or CMADs, and consists of the sum of inpatient discharges (adjusted by an index of inpatient case-mix intensity) and an estimated discharge equivalent for outpatient service activity (defined as outpatient charges divided by the quotient of inpatient charges and inpatient admissions).  In many cases the expense per CMAD (total expenses divided by CMADs) has been referred to simply as “CMAD,” confusing the output denominator with the expense-per-output measure that is the focus of the analysis.  To maintain this distinction and clarify the analysis, in this decision expense per CMAD will be labeled ECMAD.
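
The CMAD and ECMAD arithmetic described above can be sketched as follows.  The helper functions and all input figures are hypothetical, introduced only to make the formula concrete; they do not reproduce any hospital's actual data.

```python
# Sketch of the CMAD / ECMAD arithmetic as described in this decision.
# All input figures are hypothetical.

def cmads(ip_discharges, case_mix_index, op_charges, ip_charges, ip_admissions):
    """Case-mix adjusted discharges: case-mix-weighted inpatient discharges
    plus an outpatient discharge equivalent (outpatient charges divided by
    the average inpatient charge per admission)."""
    op_equivalent = op_charges / (ip_charges / ip_admissions)
    return ip_discharges * case_mix_index + op_equivalent

def ecmad(total_expense, cmad_count):
    """Expense per case-mix adjusted discharge (the measure analyzed here)."""
    return total_expense / cmad_count

# Hypothetical hospital: 5,000 discharges, case-mix index 1.2,
# $40M outpatient charges, $100M inpatient charges, 5,000 admissions,
# $90M total expense.
c = cmads(5000, 1.2, 40e6, 100e6, 5000)
print(c)               # 8000.0 CMADs
print(ecmad(90e6, c))  # 11250.0 dollars per CMAD
```

Note that the outpatient discharge equivalent uses charges, per the srHS formula; the statutory definition discussed below would substitute revenue in that term.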

The payor intervenors have raised the issue as to whether the definition of ECMAD as used in the srHS Report, which is based on the same formula used in prior measurable cost savings proceedings, follows the statutory definition, as set forth in the “Act To Implement Certain Recommendations of the Commission To Study Maine's Community Hospitals,” P.L. 2005, ch. 394, § 4(1)(B).3 The difference is primarily in the manner in which outpatient activity is converted to a discharge equivalent, with the statutory definition relying on inpatient and outpatient revenue rather than inpatient and outpatient charges.  No information has been placed on the record containing the required components of separated inpatient and outpatient hospital revenue, and so the impact of including such a measure cannot be determined from the record.  Thus, as a factual matter, the srHS formula for calculating ECMAD is a measure of hospital costs that is reasonably supported by this record; the question of whether it is inadmissible as a matter of law is outside the jurisdiction of the Superintendent.

Analytically, revenue is a measure preferable to charges since it is less subject to distortions related to changes in charge levels (i.e., nominal prices) that are not directly related to changes in actual cost or revenue of the hospitals.4 However, this concern is less relevant for this proceeding given that the data and analysis on the record include comparison to ECMADs in other states.  To the extent that phenomena like increasing charge levels influence the measurement of ECMAD, the impact on the analytical results will be much less material in a multistate analysis than in a Maine-only analysis, as long as these factors change in similar ways over time across states.

Although the use of ECMAD as the relevant measure was established early in the process that surrounded the formulation and implementation of the Dirigo Act, it is certainly not the only measure of hospital costs that could be analyzed.  Total cost, for example, is relatively simple and objective.  Each has its advantages and disadvantages.  A per-output measure of cost such as ECMAD has the advantage of having its variable cost component less affected by extraneous factors affecting volume, such as population growth.  It has the disadvantage of rising and falling due to the impact of these same factors on the volume over which fixed costs are spread.  Total cost can grow or shrink in ways that are largely volume driven but are not affected by distortions in output measurement.  Per capita measures might be preferable to both total cost and per-output cost.  However, ECMAD is the only measure that has been analyzed on this record.  The payor intervenors have disputed this measure but have not placed an analysis using different measures on the record.

Clarifying the Question to be Analyzed

It is important to define exactly what question must be answered in assessing AMCS attributed to the hospital savings initiative, particularly because many of the assertions made in the proceeding appear to be focused on different questions.  Given that hospital costs and costs per CMAD have continued to rise, by all measures, throughout the period in question, determining the counterfactual cost figure (against which savings are measured) requires assessing how much more they would have grown in the absence of Dirigo.

Further, any cost reduction due to the voluntary cost control efforts related to Dirigo will be occurring simultaneously with changes in the rate of growth in hospital costs due to other factors (the technical term is the “secular trend” in cost), and at a theoretical level this secular trend could be either increasing the rate of growth in costs or decreasing the rate of growth in costs.  Thus it is possible that any measured reduction to the overall rate of growth in Maine hospital costs over time overstates or understates the impact of Dirigo, depending on the size of the Dirigo-related cost impact and the size and direction of the secular cost trend.  One important task, then, is to in some way attempt to separate the secular trend from the impact of Dirigo.

One can imagine polar-opposite scenarios about the relationship of secular trend and the impact of Dirigo.  It could have been that cost growth in Maine was the highest in the country but still lower than it would have been in the absence of Dirigo.  It also could have been that Maine hospital cost growth had dropped dramatically and by more than in any other state but that this drop was not at all attributable to Dirigo.  DHA argued that all reductions in the rate of cost growth are attributable to Dirigo, while the payor intervenors argued that the existence of reductions in the rate of cost growth in other states implies that reductions in cost growth in Maine cannot have been attributable even in part to Dirigo.  Neither is necessarily true, and they cannot both be true; further evidence and further analysis are essential.  Increases or decreases in cost do not by themselves answer the central counterfactual question:  At what level would hospital costs in Maine have grown in the absence of Dirigo?

The Superintendent’s Prior Decisions and Guidance

Because counterfactual estimates are inherently difficult, complex, and require multiple judgments and assumptions, the quality of those estimates depends on the quality of the information presented.  In order to constitute valid “measurable cost savings” for purposes of these proceedings, it is not enough for savings to exist.  They must be identified and measured in some comprehensible and meaningful way.  The existence of savings is important, but equally important to the savings determination is the quality of the information placed on the record to support the existence of those savings.  Shortcomings in this regard have been a consistent characteristic of the cases made in each year’s proceeding.5

Notwithstanding these shortcomings, the parties have made many assertions about what prior findings of savings imply for the determination of savings in 2007.  However, the level of savings approved in prior years has no direct relevance to the degree to which reasonable evidence exists on this year’s record to substantiate savings.  The savings amounts approved by the Superintendent in prior proceedings are not independent best estimates; they are judgments as to whether the determination of the Board was reasonably supported by the evidence on the record, in whole or in part.  In any of the prior years, it is possible that there existed real savings that were not recognized because insufficient evidence appeared on the record to meet the reasonableness standard, whether because the evidence demonstrating those savings was not yet in existence, because the evidence had not been introduced into the record, or because its full implications had not been adequately understood at the time.  As a result, any assertion that amounts approved for prior years represent a natural limit on amounts that can be approved in the current proceeding is unfounded.

Additional assertions have been made in this year’s proceeding about the guidance provided in the Superintendent’s prior decisions, and what methodologically should or should not have been done to adhere to that guidance.  In prior years, DHA has put forward a method for calculating hospital savings that relied on the series of Maine-specific ECMAD numbers in the three years prior to Dirigo’s implementation, comparing the average growth in that period to the average growth in the post-Dirigo period.  The method made one minor adjustment acknowledging the need to control for secular trend: an adjustment for the rate of growth in hospital input prices in each year.6  In last year’s decision the Superintendent made the point that the rate of growth in Maine-only hospital costs in the period ending in 2003 was more relevant to its expected (counterfactual) rate of growth in 2004 than it would be to its expected rate of growth in 2007.  This has been misinterpreted to mean that the pre-Dirigo period in Maine is irrelevant or necessarily misleading for the assessment of 2007 savings.  The pre-Dirigo period declines in direct importance over time, but it was neither the only relevant information in 2004 nor irrelevant in 2007.  The further in time we are from the base period ending in 2003, the less the Maine base period information can be relied upon to make the assessment, and the greater the need for comparative information from other states and other cost-influencing factors.

An example illustrates this point.  Imagine two states, State A and State B, with annual rates of growth as indicated in Table 1 below.

Table 1

Year    State A Cost Growth    State B Cost Growth
1       8%                     2%
2       9%                     3%
3       7%                     1%
4       3%                     3%
If we know that specific steps were taken to control costs in State A beginning in Year 4 and have additional independent evidence of its impact, the comparison of State A’s change in cost growth in the first three years is a more appropriate comparison to the fourth year growth in State A than to the rate of cost growth in State B in the fourth year.

Now, imagine the passage of three more years.

Table 2

Year    State A Cost Growth    State B Cost Growth
1       8%                     2%
2       9%                     3%
3       7%                     1%
4       3%                     3%
5       4%                     6%
6       6%                     9%
7       7%                     11%
 

In the seventh year, the Year 1 through Year 3 experience in State A is less relevant and informative about whether the level of cost growth is lower than the counterfactual than it was in the fourth year.  With the passage of time, the comparison to what has happened in other states becomes more important.  In the example, the increases in State A may still reflect restrained growth if the growth in the comparison group has been even larger.  Thus, a multistate comparison becomes more appropriate and important, because the impact of other forces reflected in the secular trend has more time to affect the cost growth in State A and can be partially controlled for by introducing the comparison.  This comparison is further strengthened if we introduce additional information about cost drivers into the analysis.  For example, if we knew that inflation had driven wage rates up dramatically in Years 5 through 7 in State B but not in State A, the interpretation of the comparison would change.  This illustrates how a multivariate analysis helps to control for the secular trend.  If we have a number of factors that can affect cost growth (which we do), then regression analysis is a framework that allows one to simultaneously control for the many variables that affect cost growth.  If one is compelled by law to assess the counterfactual related to the Dirigo implementation several years after implementation has begun, one can partially address the increasing effect of the secular trend on cost growth levels by using a multistate, multivariate regression framework.
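
The multistate, multivariate regression framework described above can be sketched in outline.  This is a minimal simulation with invented data: one policy state, year effects, and a wage-growth control.  It is not a reconstruction of the srHS model; the variable names, effect sizes, and data are all hypothetical.

```python
import numpy as np

# Minimal sketch of a multistate regression: cost growth regressed on
# year effects, a wage-cost control, and a policy indicator.
# Data are simulated; nothing here comes from the record.
rng = np.random.default_rng(0)

n_states, n_years = 10, 7
policy_state, policy_start = 0, 4          # state 0 adopts a program in year 4
wage_growth = rng.normal(0.03, 0.01, size=(n_states, n_years))

rows, y = [], []
for s in range(n_states):
    for t in range(n_years):
        policy = 1.0 if (s == policy_state and t >= policy_start) else 0.0
        year_dummies = [1.0 if t == k else 0.0 for k in range(1, n_years)]
        rows.append([1.0, wage_growth[s, t], policy] + year_dummies)
        # True model: secular trend + wage pass-through - 2-point policy effect
        y.append(0.05 + 0.01 * t + 0.5 * wage_growth[s, t]
                 - 0.02 * policy + rng.normal(0, 0.002))

X = np.array(rows)
y = np.array(y)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(round(beta[2], 3))  # policy coefficient, close to the true -0.02
```

The point of the sketch is the design, not the numbers: because the year dummies absorb the common secular trend and the wage variable absorbs a measured cost driver, the policy coefficient isolates the program effect that a simple before-and-after comparison would confound.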

So, the guidance in last year’s decision was not, and is not, intended to have Maine’s pre-Dirigo experience ignored, but rather to have it supplemented by other information, which is now increasingly important in the assessment framework.  The multistate, multivariate regression approach also has the advantage of making some of the shortcomings associated with ECMAD as the basis for the hospital cost growth calculations less important.  Any factor that is changing in Maine and affects the estimated rate of growth in costs, but that also has a similar impact on other states’ measured cost growth, or that is controlled for by variables in the regression analysis, will have much less of a distorting effect in the multivariate regression analysis.

Having described the advantages of multivariate regression, it is necessary to caution that, as with any other tool, its advantages are realized only if it is designed and executed appropriately and well.  An analysis lacking reliable documentation, and lacking evidence in the record that it was executed in accordance with sound professional principles, is inconsistent with the Superintendent’s guidance and does not provide the same degree of benefit as an analysis supported by documentation demonstrating that sound methods and reasoning were employed.

Carrying Out the Analysis and Documenting Analytical Evidence for Savings

Analyzing the change in ECMAD over time nationally requires voluminous and detailed source data, and numerous steps to clean, aggregate, calculate, analyze, and interpret those data.  To provide clear evidence that would enable an objective assessment of the conclusions reached, it is essential to document all of these steps clearly, and the reasoning used to draw the conclusions.  As in prior years, however, there are deficiencies in the completeness and clarity of the record before us.  In the Superintendent’s hearing questions, it was requested that DHA point to those places in the record at which these types of information had been provided.  While the response to this question clarified the steps taken by DHA’s consultant, it also makes clear that the information provided on the record by DHA could have and should have been more complete.  For example:

  • “Step 1” is described as “Raw MCR data received from AHD for SFY2000-2007”.  The location cited for this information is “AR 4-65, (DHA Exhibit 3), dha_dataset_20.mdb, Sschramm080519.”  (Dirigo Health’s response to Hearing Questions, page 1).  This refers to a table contained in a Microsoft Access database provided by DHA on CD.  This table contains information apparently received by srHS from AHD (American Hospital Directory) containing Medicare Cost Report (MCR) data.  However, unlike in past years’ proceedings, the fields contained in this table do not contain any reference to their source fields in the MCR, and there is no documentation provided allowing one to make such a crosswalk.  Furthermore, there is no information about what AHD did to the raw CMS MCR file.  Since the “raw” file received from AHD is clearly not the MCR file from CMS, it is not possible to ascertain what exactly AHD did to the file before sending it to srHS; at a minimum, fields were subset and renamed, and it appears that more may have been done.
  • “Step 100” in the response is:  “If outliers are present with tabl_FINAL State Data, the hospital data for that state is checked and adjustments are made to the raw data, reflected in Access table named Sschramm080519 and the process is rerun from the beginning so changes flow through.”  (Dirigo Health’s response to Hearing Questions, page 5).  Relating this back to Step 1 quoted above, according to Step 100, DHA and its consultant directly edited the source data table that it received from AHD, so that the “raw” MCR data on the record, according to DHA, has been edited in such a way that no original table as received from AHD is located on the record, and no audit trail of modifications made to this key source table is available.7
  • Despite the complexity of the analysis and the fact that three different approaches and sets of regressions were performed, Appendix G, “CMAD Calculations,” contains approximately one page of text, which is insufficient to describe the various regression analyses that were done to substantiate the recommended savings levels.

However, it is not only the information provided by DHA that has been found incomplete and lacking in transparency.  The payor intervenors have also in many instances failed to substantiate their own arguments.  For example:

  • Although the witness for the Chamber acknowledged and documented the ability to replicate the regression results produced by srHS, and criticized the failure of DHA to adjust for a variety of alleged shortcomings in the regression model, he did not present standard adjustments to the regression model that address these issues.  Although the Chamber cited lack of time, these adjustments are easy to implement and not particularly time-consuming once the model has been set up.  For example, heteroskedasticity, when present in the data, can potentially inflate the level of statistical significance found in the regression.  The Superintendent’s consultant, after replicating the regression analysis, re-ran the regression in modified form to address the heteroskedasticity, and determined that there was no material change to the results based on this adjustment.  Implementing and evaluating the adjustment for heteroskedasticity took the Superintendent’s consultant less than 30 minutes.
  • Numerous criticisms that the payor intervenors introduced in prior years’ proceedings, and that were responded to in the Superintendent’s prior decisions, are raised anew in this proceeding, seemingly ignoring those decisions and the reasoning provided in them.
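
The heteroskedasticity check described in the first bullet above can be sketched generically.  This is a minimal simulated example of White (heteroskedasticity-robust) standard errors; it is not a re-run of the actual srHS regression, and all data and coefficients are invented.

```python
import numpy as np

# Sketch of a heteroskedasticity adjustment: re-estimate a regression and
# compare conventional OLS standard errors with White (robust) standard
# errors.  Data are simulated and deliberately heteroskedastic.
rng = np.random.default_rng(1)

n = 200
x = rng.uniform(1, 5, n)
# Error variance grows with x: the classic heteroskedastic pattern.
y = 2.0 + 0.5 * x + rng.normal(0, 0.2 * x)
X = np.column_stack([np.ones(n), x])

beta = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta
XtX_inv = np.linalg.inv(X.T @ X)

# Conventional OLS variance: s^2 (X'X)^-1
s2 = resid @ resid / (n - 2)
se_ols = np.sqrt(np.diag(s2 * XtX_inv))

# White robust variance: (X'X)^-1 X' diag(e^2) X (X'X)^-1
meat = X.T @ (X * (resid ** 2)[:, None])
se_white = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))

# The coefficient estimate is unchanged; only the standard errors differ.
print(beta[1], se_ols[1], se_white[1])
```

As the sketch suggests, the adjustment leaves the point estimate untouched and changes only the inference, which is why implementing and evaluating it is quick once a regression has been replicated.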

In short, both DHA and the payor intervenors have omitted important information and introduced extraneous and misleading information, the net effect of which is to make more difficult the already difficult task of determining AMCS related to the hospital savings initiative.

As in past years, the Superintendent has reviewed the record, evaluating the underlying data and methods placed on the record in order to illuminate, to the extent possible, what the written portions of the record leave unexplained, and to evaluate the assertions made by both parties.

Issues with the Data Used to Perform the Analysis

A central distinction between the analysis done for this year’s proceeding and those done in prior years is the use of a national data set.  This difference required the use of the national Medicare Cost Report data set obtained (indirectly through AHD) from the CMS, representing over 4,000 hospitals, rather than using the detailed actual cost reports from the 36 hospitals located in Maine.  The careful review efforts made by the payor intervenors in prior years were not feasible given this choice.

The payor intervenors’ witnesses testified to the data anomalies found in the srHS MCR data, but did not provide any evidence on the record as to the impact these data issues might have on the estimates produced by srHS.  Given the size of the dataset and the number of observations, DHA’s position that these issues are not material to the results is plausible, but not dispositive based on this record.

There are two specific issues that merit further investigation because of their clear potential to influence the AMCS results significantly.  The first is the observed growth in ECMAD for Maine hospitals between 2000 and 2001 in this year’s analysis, a key element in the calculation of the counterfactual “baseline” rate of growth.  The figure used in this year’s analysis was 11.6%, compared to the 4.7% figure used in prior years’ analyses.  This more-than-doubling of the 2000-2001 growth rate, in data mature enough that it should be essentially static, is troubling.  DHA’s consultants ascribe the change vaguely to the switch to the national data source described above.  It is a serious shortcoming of their analysis that they have not investigated this significant inconsistency in their new data source and provided an explanation and/or correction, especially in light of the dramatic impact the new figure has on the results, as discussed below.  This flaw raises serious questions about the credibility of the entire analysis.

The Superintendent will take official notice of the record of the Year Two proceeding for the limited purpose of providing some explanatory background as to the likely source of the difference in the year-to-year growth rate in ECMAD.  A comparison of the figures in the Year Two record with the corresponding data used in this year’s srHS analysis reveals at least two important differences in the data.  One difference is that the analysis conducted by srHS this year apparently, for the first time, calculated ECMAD on the basis of discharge totals that excluded newborns, although the documentation provided by DHA for Year Four provides no information substantiating whether that is the case.  The other difference, with a more significant impact on the 2000-2001 growth rate, is that the number of discharges (excluding newborns) in the current year’s data file for 2000 is significantly higher than the comparable number in the Year Two file:  146,561 as presented this year vs. 142,061 in the Year Two analysis.  This larger inpatient discharge number by itself depresses the cost per CMAD, and the effect is magnified by the role the number plays in the outpatient-equivalent portion of the CMAD formula:  Outpatient Charges / (Inpatient Charges / Inpatient Discharges).  The (Inpatient Charges / Inpatient Discharges) figure is reduced, which increases the value of the “outpatient discharge equivalent” because the two vary inversely.  Since both the inpatient discharges and the outpatient discharge equivalents are larger, the total CMAD (adjusted discharges) figure is much larger, reducing the ECMAD (expense per adjusted discharge) figure for 2000 and increasing the rate of growth between 2000 and 2001.
As discussed above, the original CMS cost report dataset is not on the record, and so it cannot be confirmed from the record whether the difference in the discharge totals is due to a revision in the CMS file, or whether some issue with the manipulations performed by AHD and/or srHS in some way caused an error in the data.

The second significant issue with the data is the manner in which the incomplete data for 2007 were adjusted to reflect missing data.  Due to differences in hospital fiscal years and annual filing dates, 14 of the Maine hospitals (and a much larger number of non-Maine hospitals) have only partial-year cost report information for SFY 2007.  The missing data were filled in by srHS by computing a pro-rated and trended total for each component of each hospital.  However, projecting each component of ECMAD separately may lead to inappropriate values.  What is clear is that the overall ECMAD for Maine hospitals in 2007 is quite different depending on which method is used to compute an approximation of a complete year.  The ECMAD figure used in the srHS analysis for Maine is $7,757, which represents an average annual rate of growth compared to 2003 of 4.5%.  Altering the method to use a projection of the total ECMAD amount, rather than projecting the components and recalculating, produces an ECMAD of $7,815 or an average annual rate of growth since 2003 of 4.7%.  Using the raw 2007 data (with partial year data for some hospitals) to calculate the ECMAD produces a value of $7,992, or 5.3% average annual growth since 2003.  Below, we calculate the sensitivity of the AMCS to different assumptions about the 2007 hospital costs.
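The growth rates quoted above can be verified directly from the ECMAD figures, treating each as a four-year geometric-mean rate of growth from the 2003 value of $6,511.

```python
def avg_annual_growth(start, end, years):
    # Geometric-mean average annual growth rate
    return (end / start) ** (1 / years) - 1

ecmad_2003 = 6511
# srHS component projection, total re-projection, and raw partial-year 2007
for ecmad_2007 in (7757, 7815, 7992):
    print(ecmad_2007, round(avg_annual_growth(ecmad_2003, ecmad_2007, 4), 3))
# These reproduce the 4.5%, 4.7%, and 5.3% figures quoted above.
```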

As discussed in the next section, conducting an analysis of the sensitivity of the results to the issues associated with the 2000 and 2007 ECMAD values is important to arriving at a judgment about the level of savings generated by the hospital savings initiative.

Issues with the Regression Analysis

There were a number of points asserted by the payor intervenors as flaws in the regression analysis, but many were not tested with respect to their impact on the results.  Appendix 1, at the end of this section of the analysis, is a table that summarizes the objections and evaluates the merits of the criticisms.  Because neither DHA nor the payor intervenors provided information for the record about the actual importance and impact on the estimates of the issues that might affect the validity of the regression results, the Superintendent and her consultant as part of the review of the record have examined the data on the record to arrive at independent judgments about the severity and impact of these issues.

To quantify the impact of Dirigo on hospital costs, srHS used nationwide hospital-level data relating to costs and determinants of costs for 2000 through 2007.  This information is also aggregated and adapted to state-level “virtual hospitals.”  They estimate linear (ordinary least squares) regressions modeling ECMAD as a function of key cost determinants and a linear time trend (variable Y).  Their model specification allows for the trend to vary between Maine and other states (in both intercept [variable M] and slope [variable M:Y]) and between before and after 2003 (when Dirigo’s effects were presumed to begin) (intercept effect: variable D, slope effect: variable D:Y).  Finally, they include regressors in their model specification that allow the post-2003 portion of the linear cost trend to vary between Maine and the control states (intercept effect: variable M:D, slope effect: variable M:D:Y).
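The specification described above can be sketched as follows.  The data are synthetic and the coefficient values are illustrative, not the srHS estimates; the sketch shows only how the M, Y, and D variables and their interactions enter the design matrix, with M:D:Y as the coefficient of interest.

```python
import numpy as np

years = np.arange(2000, 2008)
rows = [(s, y) for s in ("ME", "OTHER") for y in years]

M = np.array([1.0 if s == "ME" else 0.0 for s, y in rows])  # Maine indicator
Y = np.array([float(y - 2000) for s, y in rows])            # linear time trend
D = np.array([1.0 if y > 2003 else 0.0 for s, y in rows])   # post-2003 indicator

# Design matrix: intercept, Y, M, M:Y, D, D:Y, M:D, M:D:Y
names = ["const", "Y", "M", "M:Y", "D", "D:Y", "M:D", "M:D:Y"]
X = np.column_stack([np.ones(len(rows)), Y, M, M * Y, D, D * Y, M * D, M * D * Y])

# Synthetic ECMAD with a built-in post-2003 slowdown of 40/year in "Maine"
rng = np.random.default_rng(1)
ecmad = (5000 + 300 * Y + 100 * M + 80 * M * Y - 40 * M * D * Y
         + rng.normal(0, 1.0, len(rows)))

beta, *_ = np.linalg.lstsq(X, ecmad, rcond=None)
print(dict(zip(names, beta.round(1))))  # the M:D:Y estimate lands near -40
```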

In their U.S. Hospitals regression, the R2 statistic (the proportion of variation in costs explained by the variables in the model) is 0.43.  They interpret the coefficient on the M:D:Y variable, -32.2, as indicating that in 2007 hospital costs in Maine were $32 per CMAD lower because of Dirigo.  The p-value on this coefficient for the “one-sided test” (an estimate, based on some standard assumptions, of the probability that the coefficient is greater than zero) is 0.45, which is less than 50% but much higher than the significance threshold of 0.05 traditionally used by statisticians.

The payor intervenors argue that the high p-value demonstrates that the results of this regression analysis are so unreliable that the only conclusion that could be drawn from them is that there were no savings.  What the p-value actually signifies, however, is that if this regression analysis were the only information we were relying on, it would be barely more likely than not that there were any savings at all – the margin of error around the “observed” $32.2 reduction in ECMAD is so high, based on the model’s own assumptions, that zero is well within that margin of error.

The fallacy in focusing on p-values in this situation is that p-values are only relevant if the “null hypothesis” is under serious consideration as a competing analysis of the observation.  The null hypothesis is that the effect that is being measured does not really exist, and that the value obtained is entirely the result of random fluctuations in the measurement.  A high p-value means the measurement is too close to zero for the observations and analysis to be sufficient, by themselves, to contradict the null hypothesis in any meaningful way.

In this case, however, there is enough other evidence that savings exist that the null hypothesis is not a stronger competing theory.  What this means is that if the regression were properly specified and properly executed, and if it embodied the only useful information we had, then $32.2 per CMAD would be the single most likely estimate we had for the reduction in ECMAD.

However, that is very different from saying it is a reliable estimate.  It might be the most likely figure, but a wide range of other figures is almost equally likely.  The reason the p-value is so high, after all, is that the margin of error is so wide.  When the Superintendent’s consultant tested a refined version of the srHS model under several different assumptions, the 95% confidence interval was always more than $350 million wide.  That is to say, at the 95% confidence level conventionally used by statisticians, every savings calculation generated by that model carries a margin of error of more than plus or minus $175 million.  It must also be kept in mind that these statistical calculations are based only on a comparison between the observed data and the equation produced by the regression model; they assume that the data entered into the model are valid.  The significant data quality concerns discussed earlier further reduce the usefulness of any estimate generated by the srHS model.
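The margin-of-error arithmetic is straightforward: a 95% confidence interval spans roughly 3.92 standard errors (1.96 on each side of the point estimate).  The figures below use the Column A empty-model numbers reported in Table 3; the standard error is backed out from the reported bounds and is approximate.

```python
# Column A, "empty" multivariate analysis: total reduction $231 million,
# reported 95% bounds of $(33) million and $494 million (Table 3).
point_estimate_m = 231
implied_se_m = (494 - (-33)) / 3.92   # approximate implied standard error

lower = point_estimate_m - 1.96 * implied_se_m
upper = point_estimate_m + 1.96 * implied_se_m
width = upper - lower
print(round(implied_se_m), round(lower), round(upper), round(width))
```

The interval is more than $350 million wide and comfortably includes zero, which is the substance of the high p-value.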

In the srHS Cluster 1 regression, the R2 statistic is 0.98.  The coefficient on the M:D:Y variable is -185.4, and the p-value for the one-sided test that the coefficient is less than zero is 0.055, slightly above the standard criterion.  Again, srHS interpreted the coefficient on the M:D:Y variable as indicating that hospital costs in Maine were $185 per CMAD lower than they would have been if not for the passage of the Dirigo Act.  Finally, they combine the two estimates of Dirigo’s effect on hospital costs by assigning a weight of 75% to the US Hospitals estimate ($32) and 25% to the Cluster 1 estimate ($185), arriving at an estimate of cost savings of $70 per CMAD.
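The blending step is simple arithmetic, reproduced here for clarity using the unrounded coefficients:

```python
us_hospitals = 32.2  # per-CMAD estimate, US Hospitals regression
cluster_1 = 185.4    # per-CMAD estimate, Cluster 1 regression

blended = 0.75 * us_hospitals + 0.25 * cluster_1
print(blended)  # about $70 per CMAD
```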

Starting with the srHS state-level data, and a replication of the results, several issues with the srHS approach were investigated by the Superintendent’s consultant using the data on the record.  Investigating the potential impact of the issues identified requires using the data on the record to adjust the econometric modeling approach in several ways:

  • Using the full state-level data on a national basis, rather than hospital-level data on a national basis (srHS “US Hospital Model”) or a subset of national state-level observations (srHS Cluster 1 and Cluster 2 models).
  • Using the full set of control variables to account for hospital characteristics that vary by state and may affect hospital costs.  The srHS models control for different variables in the hospital-level and state-level regression.  We tested the srHS models with a specification including all variables from both srHS models (a total of 11 covariates).
  • Modifying the srHS modeling approach to correct for a number of analytic simplifications that may lead to erroneous conclusions.
    • First, the srHS approach examines differences in nominal cost levels, while the Superintendent’s consultants analyze average annual growth rates in costs.8
    • Second, the srHS model specification forces trends in costs to be linear on a year-to-year basis.  In contrast, we analyze data from 2000 to 2003 and 2003 to 2007 and compute average annual growth rates during each period using a geometric mean calculation.  Our approach has the benefit of smoothing cost changes during the pre- and post-Dirigo time periods.
    • Third, the srHS interpretation of the Dirigo program effect for 2007 is erroneously based on an examination of the coefficient on their M:D:Y variable (the interaction between Maine (M), a binary indicator variable for whether the year is greater than 2003 (D), and a linear year variable (Y)).  In the srHS model, the net impact of Dirigo in Maine (calculated as the predicted cost level in Maine relative to a hypothetical counterfactual of Maine without Dirigo in 2007) should be the sum of the coefficient on M:D plus seven times the coefficient on M:D:Y.  Even this corrected approach is problematic, as its counterfactual predicted cost in Maine in 2007 without Dirigo is estimated using a linear trend in costs in Maine during 2000 through 2003.9 In contrast, we have taken the straightforward difference-in-differences approach to identifying the effect of Dirigo, comparing the difference in average annual cost growth in Maine at the state level from before Dirigo (2000-2003, the period chosen by srHS) to after Dirigo (2003-2007) with the comparable average before-to-after difference in the rest of the country.  Again, this is a nationwide state-level regression, rather than the nationwide hospital-level regression or clustered subset of states performed by srHS.  Despite our analysis of cost growth rates (instead of levels), our methodology allows us to calculate predicted costs in Maine without Dirigo if average annual costs in Maine had changed from 2000-2003 to 2003-2007 the same as they actually did in the rest of the country – and thus calculate a net impact of Dirigo in 2007 in nominal dollar terms.  This approach captures the full experience of Dirigo and uses a more realistic and appropriate benchmark.
    • Fourth, the srHS approach weights observations by total discharges.  This presumes that in the absence of Dirigo, Maine would have been more like the larger states than the smaller states.  In our approach, we do not weight observations, with the idea that each of the other states represents an equally likely alternative to Maine’s experience in the absence of Dirigo.  That is, each state political unit with its state-level laws is an equally weighted observation in the comparison to the Maine experience.
  • Consistent with econometric best-practices, our models use a heteroskedasticity-robust standard error estimator to ensure that significance tests are not subject to bias from differences in the precision of the cost estimates that vary across states.
  • In our analysis, we compute estimated net effects of Dirigo under a range of alternative assumptions involving data quality, based on issues identified in the data quality discussion above.
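The difference-in-differences growth calculation described in the bullets above can be sketched with the Column A ECMAD values from Table 3.  Because the tabled growth rates are rounded, the outputs here differ from the tabled dollar figures by a few dollars per CMAD.

```python
def growth(start, end, years):
    # Geometric-mean average annual growth rate
    return (end / start) ** (1 / years) - 1

# ECMAD values, Table 3 column A (srHS data)
me_2000, me_2003, me_2007 = 5151, 6511, 7757
us_2000, us_2003, us_2007 = 5355, 6260, 7421

me_pre, me_post = growth(me_2000, me_2003, 3), growth(me_2003, me_2007, 4)
us_pre, us_post = growth(us_2000, us_2003, 3), growth(us_2003, us_2007, 4)

# Maine's before-to-after change in growth, net of the national change
did = (me_post - me_pre) - (us_post - us_pre)

# Counterfactual: Maine's growth slows only as much as the rest of the US did
without_dirigo_growth = me_pre + (us_post - us_pre)
without_dirigo_cost = me_2003 * (1 + without_dirigo_growth) ** 4

reduction_per_cmad = without_dirigo_cost - me_2007
print(round(did, 4), round(without_dirigo_growth, 4),
      round(without_dirigo_cost), round(reduction_per_cmad))
```

Multiplying the per-CMAD reduction by Maine's adjusted discharges yields the total-dollar figures shown in the multistate blocks of Table 3.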

Results of Respecifying the srHS Regression Model

The results of these adjustments to the analysis conducted by srHS are presented in Table 3 below, which comprises three panels:

 

Table 3

Column key:  A: srHS Data;  B: ReProjected 2007;  C: Modified 2000;
D: Adjustments B & C combined;  E: Raw 2007;  F: Adjustments C & E combined.

                                  ----A----    ----B----    ----C----    ----D----    ----E----    ----F----
                              Maine     US  Maine     US  Maine     US  Maine     US  Maine     US  Maine     US

Data
2000 Cost per CMAD           $5,151 $5,355  $5,151 $5,355  $5,489 $5,355  $5,489 $5,355  $5,151 $5,355  $5,489 $5,355
2003 Cost per CMAD           $6,511 $6,260  $6,511 $6,260  $6,511 $6,260  $6,511 $6,260  $6,511 $6,260  $6,511 $6,260
2007 Cost per CMAD           $7,757 $7,421  $7,815 $7,371  $7,757 $7,421  $7,815 $7,371  $7,992 $7,519  $7,992 $7,519

Maine-Only Analysis
Average 2000 to 2003           8.1%   5.3%    8.1%   5.3%    5.9%   5.3%    5.9%   5.3%    8.1%   5.3%    5.9%   5.3%
Average 2003 to 2007           4.5%   4.3%    4.7%   4.2%    4.5%   4.3%    4.7%   4.2%    5.3%   4.7%    5.3%   4.7%
2007 Projected @ pre-growth  $8,898         $8,898         $8,173         $8,173         $8,898         $8,173
2007 Actual                  $7,757         $7,815         $7,757         $7,815         $7,992         $7,992
Cost Growth Reduction/CMAD   $1,141         $1,083           $416           $358           $907           $182
Total Reduction (millions)     $323           $306           $118           $101           $256            $51

Multistate Analysis
Pre/Post Growth Difference    -3.6%  -1.0%   -3.5%  -1.2%   -1.4%  -1.0%   -1.2%  -1.2%   -2.9%  -0.7%   -0.6%  -0.7%
Diff-in-Diff, ME vs. US       -2.6%          -2.3%          -0.4%           0.0%          -2.2%           0.1%
“Without Dirigo” Growth        7.1%   4.3%    6.9%   4.2%    4.9%   4.3%    4.7%   4.2%    7.5%   4.7%    5.2%   4.7%
“Without Dirigo” Cost        $8,574         $8,518         $7,869         $7,817         $8,684         $7,972
Cost Growth Reduction/CMAD     $816           $703           $112             $1           $692          $(19)
Total Reduction (millions)     $231           $199            $32             $0           $196           $(5)

                                  ----A----    ----B----    ----C----    ----D----    ----E----    ----F----
                              Maine     US  Maine     US  Maine     US  Maine     US  Maine     US  Maine     US

“Empty” Multivariate Analysis*
Average Growth 2000 to 2003    8.1%   5.3%    8.1%   5.3%    5.9%   5.3%    5.9%   5.3%    8.1%   5.3%    5.9%   5.3%
Average Growth 2003 to 2007    4.5%   4.3%    4.7%   4.2%    4.5%   4.3%    4.7%   4.2%    5.3%   4.7%    5.3%   4.7%
Pre/Post Growth Difference    -3.6%  -1.0%   -3.5%  -1.2%   -1.4%  -1.0%   -1.2%  -1.2%   -2.9%  -0.7%   -0.6%  -0.7%
Diff-in-Diff, ME vs. US       -2.6%          -2.3%          -0.4%           0.0%          -2.2%           0.1%
Cost Growth Reduction/CMAD     $816           $703           $112             $1           $692          $(19)
Growth Reduction p-value      0.0860         0.1460         0.8020         0.9970         0.1520         0.9660
Growth Reduction SE/CMAD       $476           $483           $446           $453           $476           $446
Lower Bound 95% Estimate      $(117)         $(244)         $(763)         $(886)         $(241)         $(894)
Upper Bound 95% Estimate     $1,749         $1,649           $986           $889         $1,625           $855
Total Reduction (millions)     $231           $199            $32             $0           $196           $(5)
Lower Bound 95% (millions)     $(33)          $(69)         $(216)         $(251)          $(68)         $(253)
Upper Bound 95% (millions)     $494           $466           $279           $251           $459           $242

* “Empty” model regressors: Maine indicator, Year 2003 indicator, Year 2007 indicator, M*Yr03, M*Yr07.

                                  ----A----    ----B----    ----C----    ----D----    ----E----    ----F----
                              Maine     US  Maine     US  Maine     US  Maine     US  Maine     US  Maine     US

Multivariate Analysis with Controls**
Average Growth 2000 to 2003    8.8%   5.4%    8.9%   5.5%    6.5%   5.4%    6.6%   5.5%    8.8%   5.3%    6.6%   5.3%
Average Growth 2003 to 2007    3.7%   4.3%    4.0%   4.2%    3.7%   4.3%    4.0%   4.2%    4.3%   4.6%    4.3%   4.6%
Pre/Post Growth Difference    -5.1%  -1.1%   -4.9%  -1.3%   -2.9%  -1.1%   -2.6%  -1.3%   -4.6%  -0.7%   -2.3%  -0.7%
Diff-in-Diff, ME vs. US       -4.1%          -3.6%          -1.8%          -1.3%          -3.8%          -1.6%
Cost Growth Reduction/CMAD   $1,256         $1,110           $534           $392         $1,029           $307
Growth Reduction p-value      0.0000         0.0020         0.1030         0.2370         0.0010         0.1670
Growth Reduction SE/CMAD       $349           $352           $328           $331           $351           $330
Lower Bound 95% Estimate       $572           $420          $(109)         $(257)          $341          $(340)
Upper Bound 95% Estimate     $1,940         $1,800         $1,177         $1,041         $1,717           $954
Total Reduction (millions)     $355           $314           $151           $111           $291            $87
Lower Bound 95% (millions)     $162           $119          $(31)          $(73)           $96           $(96)
Upper Bound 95% (millions)     $548           $509           $333           $294           $485           $270

** Added control variables: Total Beds, Interns per Bed, Rural indicator, Days Medicare, Uninsured, Wage Index, Critical Access Hospitals, Days Medicaid, Percent Under 100% FPL, Teaching Status, All-Payer CMI.

Column A of Table 3 is based on the original srHS data, with descriptive computations and regression analysis results using the methodological modifications described above.  Because we are using the srHS data, our calculations at this stage have not yet taken into account the impact of the data problems described above on the results.  Addressing each block of column A separately:

  • The first block shows the actual ECMAD values for 2000, 2003, and 2007 for Maine and the U.S. from the srHS data.  From these, average rates of change from 2000-2003, and 2003-2007 can be computed.
  • The second block uses the ECMAD values to calculate the cost difference by subtracting the actual post-Dirigo 2007 costs from the pre-Dirigo (2000-2003) trended values for 2007.  This method shows actual spending below projected by $323 million.
  • The third block uses the available multistate data, but still makes a simple calculation on the descriptive data.  With data from other states, we can now do a “difference in differences” calculation.  It answers the following question – how did Maine’s change from 2000-2003 to 2003-2007 compare to the national change from 2000-2003 to 2003-2007?  The resulting cost difference estimate of $231 million shows that controlling for what happened to hospital costs in other states over the 2000-2007 period reduces the estimated cost difference by approximately $100 million.
  • In the fourth block, an “empty” regression analysis (one in which control variables are not included) is run using the difference-in-differences growth-rate approach described above.  This is the regression-based version of the analysis in the third block, and it therefore shows the same cost difference of $231 million, again controlling for how cost growth has changed in other states but not for other factors.  With the regression statistics we can calculate the p-values and confidence intervals associated with the cost difference.
  • In the fifth block, we introduce the full range of additional control variables.  With the additional 11 covariates, we are not just controlling for the way costs have grown in other states, but also for those factors which affect cost levels and cost growth rates.  Controlling for these other factors, the point estimate of the cost growth difference increases to $355 million, which is slightly above the simple, Maine-only trend.

 

In brief, making the analysis multistate reduces the estimated reduction in cost growth because cost growth has declined generally in the United States (but not by as much as in Maine), but adding controls for some specific factors that affect cost growth (e.g., rural indicator, percent Medicaid, etc.) increases the estimated reduction in cost growth.  Roughly speaking, the additional control variables offset the impact of adding the multistate comparison in such a way that the results resemble the simple Maine-only trend comparison.10 This outcome was not knowable without actually performing the analysis, and the result can be viewed with greater confidence given the rigor of the approach, though the confidence is diminished by the problems with the quality of the underlying data.

The Column A regression-based estimate of cost reductions, using the original data, is $355 million.  The 95% confidence interval on this result is between $162 million and $548 million.  That is, the srHS data, unadjusted, produce a cost reduction estimate with a high degree of uncertainty as to the exact amount, but a high degree of statistical significance that it is very large.  We can conclude from this that if the data could be relied upon, a methodologically sound regression analysis would have found significant and large cost reductions.  This suggests that the Cluster models for estimating cost growth reductions may not have been significantly biased relative to a national sample.  Why a national sample at the state level was not one of the analyses presented in the srHS Report is not known, nor is it known why srHS did not include in their report the detailed assessment of the Clusters that could be found in the discovery materials.

Two things must be kept in mind about the results in column A of Table 3.  First, as discussed under “Clarifying the Question to be Analyzed,” the finding of significant reductions in cost growth does not tell us to what degree those reductions are attributable to Dirigo.  Second, the results as just described rely on the srHS dataset as calculated by them, and thus are affected by significant data quality problems.  We first turn to the data quality issues and then discuss what can be reasonably supported by evidence on the record for savings related to Dirigo, given the cost reduction findings.

The Impact of the Key Data Quality Issues on the Results

There are important data issues and modeling issues that require sensitivity analysis in order to gauge the impact on the cost reduction estimate of uncertainties in the input data and of the assumptions embedded in the calculation of the counterfactual “without Dirigo” ECMAD value.  Columns B through F in Table 3 repeat the analysis just described above for different combinations of adjustments related to the data quality issues discussed earlier:

  • Scenario B replaces the 2007 ECMAD values for Maine and the rest of the U.S. with our re-projection of 2007.
  • Scenario C replaces the 2000 ECMAD with a value that makes the 2000 to 2001 cost growth equal to 4.7%, which is the figure used in prior proceedings, because srHS was unable to explain the data anomaly that resulted in the 11.6% figure used this year.
  • Scenario D makes both of the adjustments in B and C.
  • Scenario E replaces the 2007 values for Maine and the rest of the U.S. with the raw 2007 values. That is, ECMAD is calculated from the partial year cost reports, which are equally incomplete for the relevant components needed to calculate ECMAD.
  • Scenario F uses the raw 2007 values from Scenario E and the higher value for the 2000 ECMAD from Scenario C.
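The Scenario C adjustment can be reproduced, to within the rounding of the quoted growth rates, by backing out the implied 2001 Maine ECMAD from the srHS 2000 value and the anomalous 11.6% growth rate, and then recomputing a 2000 value consistent with the 4.7% growth used in prior years.

```python
ecmad_2000_srhs = 5151
implied_2001 = ecmad_2000_srhs * 1.116        # the anomalous 11.6% growth
ecmad_2000_modified = implied_2001 / 1.047    # force the prior-year 4.7% rate
print(round(ecmad_2000_modified))             # near the $5,489 shown in Table 3
```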

The adjustments applied to the 2000 and 2007 values are not in any real sense “precise,” in that we do not know the correct amount by which to adjust the values without a thorough review of the source data beyond what has been placed on the record (i.e., careful construction of a cleaned data set from the Medicare Cost Report file).  In addition, inserting values in for a key variable (e.g., 2000-2001 ECMAD growth) without making a consistent edit to the source data across the entire U.S. source file may lead to results less significant than would be the case with application of consistent edits.  It is clear, however, that the results are sensitive to data quality and that the record does not contain evidence necessary to resolve the data quality issues definitively.

Scenario F addresses the data issues most directly, adjusting the 2000-2001 cost growth number downward in a way that disregards the anomalous 2000 value, and avoiding arbitrary adjustments and projections to the 2007 data by using the raw 2007 figures.  The point estimate for cost reduction produced in this scenario is $87 million, with a 95% confidence interval ranging from negative $96 million to $270 million.

Interpreting the Results

The preceding discussion developed an estimate for cost reduction in the post-Dirigo period that has the following properties:

  • The confidence range is extremely wide, which means the estimate, without corroboration from other sources, has little explanatory power.
  • It is an estimate of cost reduction in 2007, which includes some combination of secular trend and Dirigo-related cost savings.
  • It is an estimate based in part on comparison to contemporary performance in other states, which provides some degree of control for secular trend.  However, the difference between the scenarios using the 11.6% 2000-01 growth and the corresponding scenarios using 4.7% growth shows that the model remains highly sensitive to unusually high growth early in the pre-Dirigo period.  This also suggests that the model is sensitive to the choice of a pre-Dirigo base period.
  • The regression itself does not provide information allowing us to parse out the effects of Dirigo and the secular trend, which is likely to include significant regression to the mean, given the high rate of growth in Maine pre-Dirigo and the model’s sensitivity to that rate of growth.

Although this model, taken alone, would suggest that $87 million is the best estimate of post-Dirigo cost reductions, the broad confidence interval and the questions raised about the underlying data mean that a wide range of figures might be equally reasonable.  Therefore, any other evidence in the record that would shed light upon cost reductions needs to be given significant weight.

There is one competing estimate on the record advocated by some payor intervenors.  This estimate was developed by a consultant for the Maine Association of Health Plans, Jack Burke, FSA, MAAA of Milliman.  Mr. Burke’s methodology, which was also used to develop a competing savings estimate in Year Three, used as its starting point the general methodology developed by srHS and approved by the Board in Years One through Three: calculating the counterfactual “without Dirigo” ECMAD estimate by trending the observed Maine ECMAD figures during the base period.  However, Mr. Burke made significantly more conservative assumptions, yielding lower figures.  In Year Three, Mr. Burke found estimated savings of $8.2 million (as compared to the srHS estimate of $88.4 million).  This year, he found estimated savings of $21.2 million.

The flaws in the trending approach underlying the Burke estimate have been exhaustively documented in Years One through Three, and Mr. Burke himself did not offer that figure as a definitive or reliable estimate of the hospital cost savings.  However, the srHS estimate also has little persuasive power, even after the best efforts made to correct for its deficiencies.  The srHS estimates and the Burke estimate, notwithstanding their deficiencies, are the only measurements of aggregate hospital cost savings on the record.

As in Years One through Three, DHA again failed to build a conclusive and comprehensive record.  Some intervenors argue that the failure of either estimate to measure savings reliably should equate to a finding of no savings.  However, there is persuasive evidence on the record that savings have been realized, even if the amount of those savings is difficult to calculate with precision.  The Board’s finding must therefore be approved in part, and my task is to determine what part is reasonably supported by the evidence in the record.

Given the lack of definitive information, it is reasonable to treat the Burke model as a lower bound and the best available refinement of the srHS model as an upper bound.  It would not be reasonable to give equal or nearly equal weight to the srHS model and the Burke model, even with the corrections that have been made to the srHS model, because the conservative assumptions underlying the Burke model implicitly recognize a secular trend that the srHS model does not, and because data quality concerns and an absence of documentation pervade the srHS model.  Therefore, I find it reasonable to give two-thirds weight to the Burke estimate of $21.2 million and one-third weight to the regression estimate of $87 million.  Rounding the resulting weighted average down to the nearest $10 million, because attempts at greater precision are not reasonably supported by this record, produces an estimate of $40 million for hospital cost savings.
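The weighting and rounding step is arithmetic:

```python
import math

burke_estimate = 21.2        # $ millions
regression_estimate = 87.0   # $ millions

weighted = (2 / 3) * burke_estimate + (1 / 3) * regression_estimate
savings = 10 * math.floor(weighted / 10)   # round down to the nearest $10 million
print(round(weighted, 1), savings)
```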

The greater weight given to the lower estimate can be corroborated by anecdotal evidence, and the persuasive power of anecdotal evidence is not insignificant in light of the low statistical credibility and limited explanatory power of the projections from the models presented by the parties.  Largely absent from this record, and from the records of the prior year proceedings, is testimony or other information provided by the hospitals that are the subject of this savings initiative.  With the relatively indirect method of measuring changes in publicly reported cost data serving as the primary analytical strategy, specific information from the institutions that are the focus of the hospital savings initiatives would be very helpful.  It should be the goal of any future activities aimed at measuring the impact of the hospital savings initiative to give due weight to reports of actual savings by the hospitals themselves, and to consider detailed surveys and/or testimony from these institutions about exactly what they are doing in response to Dirigo, what they estimate those actions to have saved, and how those estimates were derived.

One piece of evidence on the record from hospitals this year is DHA Exhibit 7, which contains a quote from a newspaper article authored by Elizabeth Mitchell, Senior Director of Public Policy at MaineHealth.  The Exhibit contains the following specific quote:

“There is ample evidence that the voluntary effort to limit increases in hospital spending and operating margins that began four years ago is one aspect of the Dirigo reforms that is working exactly as Governor Baldacci and those of you who supported the original legislation anticipated.

Keeping faith with the commitments we have made is a high priority for the MaineHealth board and our hospitals’ trustees.  The average annual increase in our member hospitals’ cost per adjusted discharge has been approximately 3.2 percent, which compares favorably to the Dirigo target of 4 percent.

Maine Medical Center has reduced its prices four times during the past three years to reduce its operating margin.  We estimate that these reductions have saved privately insured patients and their employers almost $40 million.”

This statement by a member of senior management at Maine’s largest hospital group (Maine Medical Center, Miles Memorial Hospital, Saint Andrews Hospital, and Stephens Memorial Hospital) is a strong assertion that Dirigo has influenced cost levels and hospital margins (via price reductions), and thus represents further evidence that the savings are real.  In addition, this statement provides some insight into the problem of correcting for regression to the mean and identifying actual Dirigo savings, since it was based on the hospital’s own evaluation of its cost containment efforts rather than on an observation of how observed trends compared to some hypothetical baseline.

On the other hand, this figure represents only one hospital group, albeit the state’s largest, and there is no evidence that it was calculated in any rigorous manner.  The true amount might be lower because of the human tendency to overstate, or it might be higher because the $40 million claimed to have been passed along to payors is not necessarily the entire amount of measurable cost savings, or because some savings might have been inadvertently overlooked when the report was compiled.

Therefore, the most that can be said about the MaineHealth report from a quantitative perspective is that it is not inconsistent with $40 million in annual statewide savings, when we consider the expectation that MaineHealth represents a major proportion of statewide savings and the expectation that more of the observed savings occurred in the third year because cost containment efforts in prior years would have residual effects in later years.  For example, if $20 million of the savings were in 2007 and they represented half of the statewide savings, that would be $40 million statewide.

Conclusion

For these reasons, the Superintendent finds that the Board’s determination of Dirigo-related savings of $119 million is not reasonably supported by the evidence in the record, but that part of that estimate, in the amount of $40 million, is reasonably supported.


Appendix 1

Brief Analyses of Additional Issues Raised Related to Regression Results

Each entry below lists the issue raised, the citation to the record, and the Superintendent’s comment.

Issue: The model does not capture the effect of Dirigo.
Citation: Dobson prefiled (p. 4, ln 19)
Comment: The “D” term was not intended to capture the effect of Dirigo by itself; it simply distinguishes the pre- and post-Dirigo time periods.  The interaction terms D*M and Y*D*M capture the impact on Maine costs from 2004 onward.

Issue: The model finds purported “Dirigo-related” savings in other states.
Citation: Dobson prefiled (p. 27, ln 12)
Comment: Dirigo is at least one significant influence on costs in Maine during this period, but not the only one.  By definition, approximately half of the states will have above-average cost growth and half below-average cost growth, but the reductions in some states are more significant than in others.  Although Dirigo cannot be a significant influence on costs in other states, since it did not exist there, that does not mean those states had no comparable cost containment efforts or other state-specific and time-specific factors.  The results indicate that Maine had below-average cost growth when controlling for a variety of factors.  It is difficult to isolate the impact of voluntary hospital cost savings more explicitly; however, the law requires that a measurement be made.

Issue: The model fails to take into account other factors that reduced costs in Maine from 2004 onward.
Citation: (none cited)
Comment: Given the controls in the multivariate model, the other plausible explanation is regression to the mean, but regression to the mean can coexist with a non-zero Dirigo effect.

Issue: srHS used the “wrong” value for the year multiplier.
Citation: Dobson prefiled (p. 21, ln 22)
Comment: This is incorrect.  The year values cannot be recoded without also recoding them in the underlying data and re-running the regression to obtain new coefficients.  Whatever values are chosen, successive values of this variable simply indicate the passage of one year of time.

Issue: Maine hospitals have not shown efficiency improvements.
Citation: Dobson prefiled (p. 34, ln 10)
Comment: This argument is specious because the relevant comparison is not between Maine hospitals and non-Maine hospitals post-2003, but between Maine hospitals as they actually were post-2003 and Maine hospitals in a hypothetical post-2003 world with cost growth rates like the actual non-Maine rates.  Showing that Maine hospitals differ from non-Maine hospitals therefore misses the point of the comparison.

Issue: The lack of statistical significance on the key terms means we must accept that there is no savings finding.
Citation: Maffei prefiled (p. 11, ln 8); Dobson prefiled (p. 25, ln 5)
Comment: The claim that the model is invalid, or does not support a finding of cost savings, because the individual coefficients are not significant is misleading.  The relevant question is whether the total net cost savings figure is significantly different from zero (presumably in the proper direction).  To answer it, one must perform a Wald test after each regression on the magnitude and significance of the quantity “M:D + 7*M:Y:D,” which depends on two coefficients, one of which is multiplied by 7.  There would often be some association between the significance of the key variables and the significance of the Wald test.  This test is not raised on the record, but in our analysis of the data on the record its result is not significant, so the ultimate conclusion may be the same.  The argument that the variables should have been removed is invalid for an evaluation regression.
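For illustration, a test of such a linear combination of coefficients can be computed by hand; the sketch below uses simulated data (all variable names and values are hypothetical, not the record data) and estimates the combination and its standard error from an ordinary least squares fit:

```python
import numpy as np

# Simulated panel (hypothetical): M = Maine indicator, Y = year index 0..7,
# D = post-period indicator
rng = np.random.default_rng(42)
n = 400
M = rng.integers(0, 2, n).astype(float)
Y = rng.integers(0, 8, n).astype(float)
D = (Y >= 4).astype(float)

# Design matrix columns: const, M, Y, D, M:D, M:Y:D
X = np.column_stack([np.ones(n), M, Y, D, M * D, M * Y * D])
beta_true = np.array([100.0, 2.0, 3.0, 1.0, -3.0, -0.5])
y = X @ beta_true + rng.normal(0, 1.0, n)

# OLS by least squares, with the usual covariance estimate
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)

# Test H0: M:D + 7*M:Y:D = 0, i.e. the net effect at year 7
c = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 7.0])
effect = c @ beta            # estimated net effect (true value here is -6.5)
se = np.sqrt(c @ cov @ c)    # standard error of the linear combination
t_stat = effect / se
print(effect, se, t_stat)
```

The point of the sketch is that the significance of the combined quantity, not of each coefficient separately, answers the question posed in the comment.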

Issue: There is a bias in the results due to omitted variables.
Citation: Maffei prefiled (pp. 16-19)
Comment: In pointing to the relationship between employment and costs, Maffei appears to confuse increased total costs for insurers with increased cost per case for hospitals.  As previously interpreted by the Law Court, cost per case is an acceptable measure for a savings calculation.  No evidence was provided on the record of what adding these variables would do.  In addition, there are methodological adjustments to the form of the regression analysis that address omitted-variables bias; the consultant neither mentioned them nor presented results from such a modification.

Issue: Cost per CMAD should have been logged for the regressions.
Citation: Dobson prefiled (p. 18, ln 13)
Comment: This is theoretically possible but not necessary with aggregate average cost per discharge data.  Individual spending data or individual cost per case data are highly skewed, but aggregate average cost per case data are approximately normally distributed.  Furthermore, OLS regression is fairly robust to skewness in the dependent variable, especially for purposes of hypothesis testing (less so for forecasting).  Using the srHS data set, we logged this variable and the findings are very similar (but still not significant with only that change).
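The point that individual cost data are highly skewed while aggregate averages are approximately normal follows from the central limit theorem; a small simulation (hypothetical lognormal costs, with an arbitrary grouping size) illustrates it:

```python
import numpy as np

def skewness(x):
    """Sample skewness: third standardized central moment."""
    x = np.asarray(x, dtype=float)
    return float(np.mean((x - x.mean()) ** 3) / x.std() ** 3)

rng = np.random.default_rng(0)

# Hypothetical individual-level costs: lognormal, heavily right-skewed
individual = rng.lognormal(mean=8.0, sigma=1.0, size=200_000)

# Aggregate averages over groups of 1,000 "discharges": approximately normal
aggregate = individual.reshape(200, 1000).mean(axis=1)

print(skewness(individual), skewness(aggregate))
```

The individual-level skewness is large and positive, while the skewness of the group averages is close to zero, which is why logging the aggregate dependent variable changes little.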

Issue: Results should be the same regardless of which sample is used.
Citation: Maffei prefiled (p. 26, ln 12)
Comment: This is not necessarily true.  Given the heterogeneity among states in the U.S., there is no ex ante reason to presume that a comparison of Maine with a U.S. control group will lead to the same estimates as a comparison of Maine with a particular subset of states.  The challenge is to identify the “best” control group, where the “best” group most accurately represents the counterfactual of what Maine would have looked like in the absence of Dirigo.  Neither srHS nor the payor intervenors has demonstrated this.  We have addressed the question, in a manner similar to the Board’s disregarding of the Cluster model, by running a national state-level regression.

Issue: Results are not valid due to autocorrelation and heteroskedasticity.
Citation: (none cited)
Comment: This is a theoretical problem that the payor intervenors identified but did not correct for, despite having replicated the results, which would have made an adjustment easy.  We made corrections to the model to address these issues and found no meaningful change in the results.

Issue: Clustering of hospitals within states creates errors in estimation.
Citation: (none cited)
Comment: This is theoretically possible but correctable.  It is plausible that hospitals in the same state are affected by that state’s conditions and regulations, and this commonality can lead to statistical problems.  If the problem is simply that the error terms are correlated among hospitals within a state, the standard errors in the regression can be corrected to account for that.  If the problem is that there are unobservable, state-specific factors that influence costs directly and vary across states, all of those (time-invariant) factors can be controlled for at once by adding state fixed effects (dummy variables for all but one state) to the regression specification.  The payor intervenors put no attempt to fix this issue on the record despite replicating the results, though it is easy to do.  We did so, and it does not meaningfully change the results.
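The state fixed-effects construction described in the comment can be sketched mechanically; the data below are hypothetical and illustrate only how the dummy columns are built:

```python
import numpy as np
import pandas as pd

# Hypothetical hospital-level data (names and values are illustrative only)
rng = np.random.default_rng(1)
states = ["MA", "ME", "NH", "VT"]
df = pd.DataFrame({
    "state": rng.choice(states, size=40),
    "beds": rng.integers(50, 500, size=40),
})

# State fixed effects: dummy variables for all but one state
# (drop_first keeps one state as the reference category and avoids the dummy trap)
dummies = pd.get_dummies(df["state"], prefix="state", drop_first=True)

# Regressor matrix: substantive controls plus the state dummies
X = pd.concat([df[["beds"]], dummies], axis=1)
print(list(dummies.columns))
```

With the dropped reference state, the remaining dummies absorb all time-invariant state-specific influences on costs, which is the adjustment the comment describes.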

Issue: The regressions have insignificant predictive value.
Citation: Thorpe prefiled (pp. 4-5); Maffei prefiled (p. 27)
Comment: This is incorrect.  The purpose of the exercise is to evaluate the effect of Dirigo on hospital costs in Maine.  To do that, we must compare Maine’s actual experience with an estimate of what Maine would have looked like in the absence of Dirigo.  At least in the multistate, multivariate approach, prediction is irrelevant; no one is predicting anything.  The overall explanatory power of the cost per CMAD regressions is relevant, and the R2 statistic shows how much of the variation in cost per CMAD is being explained by a particular model.  Regardless, our ability to explain cost per CMAD is necessarily limited; the only things to be gleaned from the R2 statistic in this case are (1) whether the model as a whole has any explanatory power, and (2) how much of the variation in cost per CMAD remains unexplained.  The unexplained part consists of irreducible variation (which we can never account for and which is not important for this analysis) and systematic variation due to factors that we do not observe and cannot account for (but would account for if we had data on them).  We cannot know the proportions of each.

Issue: Inflated R2 due to a small number of observations and “overfitting.”
Citation: Dobson prefiled (p. 32, ln 10)
Comment: This is incorrect.  While the testimony is correct that there are relatively few observations per variable in this model, which could result in overfitting, there is no automatic relationship between the ratio of observations to variables and R2.  Overfitting is a potential problem for forecasting, which is not of interest here.  We are not using these results to predict outside the sample; we are evaluating the effects within the sample (actually a universe of hospitals).  Furthermore, the model was re-run with hospital-level data, which produced similar results.

Issue: The estimated results should have been compared to the actual Maine results, not the predicted Maine results.
Citation: Trust pre-hearing brief
Comment: This is incorrect.  The proper way to estimate the effect is to use the regression (i.e., “estimated effects” for both).  The whole point of the regression is to control for other factors contributing to cost per CMAD trends (a variety of hospital characteristics), which are not reflected in the “actual” numbers.

 

2. Uninsured / Underinsured Savings Initiatives.  Board determination:  $23.6 million.  Amount the Superintendent finds reasonably supported by the evidence:  $6.1 million.

The Uninsured / Underinsured savings initiative component of the Board’s filing seeks to quantify savings to the health care system that result from the increase in coverage that can be attributed to the Dirigo reforms.  When Maine residents have health coverage, the providers of their medical care will be reimbursed for those services, which will reduce the amount of bad debt incurred and charity care provided.  This will, in turn, reduce the pressure to shift these costs onto private payors and will help reduce the rate of premium increases.  The savings due to these reductions in bad debt and charity care were determined by srHS to be $35.7 million.  The Board adopted this amount in part and approved $23.6 million.

Savings Estimates in the First Three Years

In Years One and Two, DHA’s consultant developed estimates of these savings based on a process that used the total amount of bad debt and charity care reported by Maine hospitals as a starting point, and then developed various assumptions to estimate the reduction in this amount that could be attributed to enrollment of a portion of the uninsured and underinsured population into DirigoChoice and MaineCare.

In Year Three, srHS revised this approach and derived its estimate of this savings from the actual claim costs of members enrolled in Dirigo and in the MaineCare parents expansions.  The result of this approach was characterized by srHS as an estimate of “new money to the healthcare system for those previously uninsured and underinsured that are now enrolled in DirigoChoice or in the MaineCare parents expansion programs.”  This approach used the actual enrollment in these two initiatives and developed assumptions about factors that included the portion of the enrollment that was previously uninsured or underinsured, and their average claim costs, to arrive at an estimate of savings that could be attributed to the Dirigo initiative.  The result of this analysis was an estimate of $14.0 million of savings.  A consultant for the Maine Association of Health Plans, Jack Burke, FSA, MAAA of Milliman, developed an alternative estimate that preserved many of the srHS assumptions and incorporated several additional assumptions, including an assumption about the variable costs to providers of the additional health care services for these newly insured individuals.  Mr. Burke’s analysis resulted in an estimate of $6.3 million.  The Board adopted the $6.3 million estimate developed by Mr. Burke and the Superintendent found that determination reasonably supported by the evidence in last year’s Decision.

The New Methodology in Year Four

This year, srHS adopted a new approach to estimating the savings in this category.  That approach attempted to measure the full, global impact of Dirigo on the rate of uninsurance in Maine, rather than the impact of covering the specific populations participating in the Dirigo initiatives.  srHS sought to capture this global impact through a new methodology, supported by statistical regression models similar to those used to analyze the impact of the hospital cost savings initiative.

Three multistate multivariate regression models were developed by srHS to produce counterfactual estimates of what the rate of uninsurance in Maine would have been in the absence of Dirigo.  The three models were a U.S. model, a Northeast model, and a Maine model.  The control variables identified by srHS included:

  • Gender
  • Age
  • Race
  • Marital Status
  • Family Size
  • Geographic Location
  • Education
  • Working Status
  • Firm Size
  • Income
  • Medicaid Eligibility
  • State Children’s Health Insurance Program (SCHIP) Eligibility

The srHS Report assumed a pre-Dirigo period of 1999-2002 and used the models to estimate what uninsurance rates in Maine would have been during the post-Dirigo period in the absence of Dirigo.  These counterfactual estimates were then compared to actual uninsurance rates in Maine to determine the impact of Dirigo, as in the methodology used for hospital cost savings.  An additional estimation step was necessary to develop a figure for 2008, because 2006 was the most recent year for which all data were available.  To project the Dirigo impact out to 2008 for the purpose of determining AMCS, srHS assumed that the actual 2002-2006 annual rate of change in uninsurance would persist to 2008.  The counterfactual 2008 uninsurance rate for each regression model was similarly projected based on each model’s predicted annual rate of change from 2002-2006.
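Expressed arithmetically, this projection step compounds the 2002-2006 average annual rate of change forward two more years.  The sketch below uses hypothetical uninsurance rates, since it illustrates only the mechanics of the projection, not the record values:

```python
# Hypothetical uninsurance rates; the actual record values are not reproduced here
rate_2002 = 0.117
rate_2006 = 0.103

# Average annual rate of change over the four years 2002-2006
annual_factor = (rate_2006 / rate_2002) ** (1 / 4)

# Assume the same rate of change persists for two more years, as srHS did
# for both the "actual" and counterfactual series
rate_2008 = rate_2006 * annual_factor ** 2
print(round(rate_2008, 4))
```

The same compounding is applied to each model's counterfactual series, so any error or bias in the 2006 estimates is carried forward and magnified, a point taken up later in the analysis.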

The 2008 counterfactual rates of uninsurance projected from the three regression models were compared to the projected “actual” Maine uninsurance rate for 2008 to arrive at three estimates of the reduction in the uninsurance rate that could be attributed to Dirigo.  Each reduction was then multiplied by the projected 2008 Maine under-65 population to arrive at an estimated reduction in the number of uninsured that can be attributed to Dirigo.  This estimate was then multiplied by $893, the estimated cost of uncompensated care per uninsured person, to arrive at an estimate of the total savings attributable to Dirigo in 2008.  The results from the three srHS regression models can be summarized as follows:

Table 4

Projected “Actual” Maine Uninsurance Rate – 2008:  9.65%

                                                 US Model    NE Model    Maine Model
2008 Projected Uninsured %                         13.8%       11.3%       12.0%
Projected Reduction in %                            4.1%        1.7%        2.3%
Reduction in Uninsured Lives                       46,898      19,077      26,458
Savings (in millions) @ $893 per covered life      $41.9       $17.0       $23.6
The srHS Report recommended a savings estimate of $35.7 million for 2008, which represents a 75%/25% weighted average of the U.S. and Northeast models.  The report concluded that a weighted average was appropriate to minimize any potential distortion to the results that might arise from health care reform efforts in other Northeastern states.  (R. at Tab 3-61, pp. 266-67).

Comments of Payor Intervenors

During the hearing before the Board and in subsequent briefs, payor intervenors raised several concerns about the methodology used by srHS, as follows:

  1. Payor intervenors observed that the methodology used by srHS to estimate bad debt and charity care savings in Year Four resulted in estimated savings that were more than five times the amount approved by both the Board and the Superintendent in Year Three, despite the fact that the enrollment in DirigoChoice has declined in the past year. They argued that the amount approved by the Board could not be a reasonable estimate because it represented a nearly four-fold increase over the amount the Superintendent determined to be reasonable in Year Three.  In support of this argument, they cited the Superintendent’s Year Three Decision, which referred to consistency with amounts found reasonably supported in prior years as “One final test of the overall reasonableness.”  (Chamber Brief, August 26, 2008, p. 32).
  2. Payor intervenors noted that the Year Three Decision had encouraged the use of a multistate multivariate regression model to evaluate savings due to CMAD, but had not provided similar direction for this savings initiative.  (BOI hearing, p. 116; R. at Tab 3-61, p. 336).

    Payor intervenors questioned the use of a multistate multivariate regression model when the actual data was available.  (Anthem Brief, August 28, 2008, p. 28).

  3. Payor intervenors commented on the lack of transparency in the regression models constructed to estimate the reduction in the uninsured/underinsured population.  (R. at Tab 1-35, p. 40).
  4. Payor intervenors criticized the inclusion of Year 2003 experience as part of the post-Dirigo period, based on the timing of the enactment of the Dirigo legislation and the enrollment in subsidized DirigoChoice products.  (R. at Tab 3-61, Dirigo Board Hearing, p. 282; R. at Tab 1-29, p. 7).
  5. Payor intervenors criticized the measurement of savings based on projected rates of uninsured in 2008 rather than the most recent actual rates in 2006.  (R. at Tab 1-35, pp. 41-42; R. at Tab 1-29, p. 7).
  6. Payor intervenors argued that the srHS methodology treats three specific and non-recurring initiatives that reduced the number of uninsured Mainers as persistent and permanent sources of negative trend in the rate of uninsured that should be attributed to Dirigo.  (R. at Tab 1-29, p. 8).
  7. Maine Association of Health Plans witness Burke argued that the per capita estimate of $893 for the cost of uncompensated care for previously uninsured individuals should be reduced 36% to account for the portion of those costs that are recovered from other revenue sources, including philanthropy and federal, state, and local governments.  (R. at Tab 1-29, p. 9; R. at Tab 3-61, pp. 337 - 338).

In addition, Mr. Burke provided his own estimate based on the method approved in Year Three.  He updated the numbers used last year based on data in this year’s record.  (R. at Tab 1-29, Burke Exhibit 2, Attachment II).  The resulting estimate of savings was $6.1 million.

Action by the Board

The Board rejected the $35.7 million savings amount recommended by srHS and instead determined that savings were best predicted by the Maine model, adopting $23.6 million.  During their deliberations, Board members expressed concerns about several aspects of the srHS proposal, including the magnitude of the increase proposed by srHS relative to the amount allowed by the Superintendent last year, the absence of any adjustment for the impact of several past enrollment shocks, and the decision to adopt the changed methodology despite the absence of any direction from the Superintendent to do so.  (R. at Tab 3-62, pp. 6-19).

Analysis

The payor intervenors object to the introduction of a multistate multivariate regression model.  Although there was no direction to do so in last year’s Decision, there is no fundamental reason not to use this approach, so long as it is well specified, correctly executed, and well documented.  However, as discussed below, that was not the case this year.

As pointed out by the payor intervenors, the new methodology resulted in estimated savings several times larger than the amount approved last year.  A comment from the Year Three Decision was cited repeatedly in this year’s record.  The Superintendent concluded last year that:

One final test of the overall reasonableness of this result is that it is not inconsistent with the $5.5 million amount found reasonably supported in Year Two, when adjusted for growth in enrollment during the intervening year and for the addition of a new category of saving within this initiative.

The payor intervenors have cited that comment as evidence that the amounts determined for bad debt and charity care savings this year by srHS and the Board are unreasonable.  That is not correct.  “Final test” referred only to its placement in the analysis and did not imply that this was the ultimate test.  An amount significantly different from past approved amounts could be approved if there were adequate support in the record for that determination.

The fact that the new methodology resulted in significantly larger savings does not in itself invalidate the methodology, but it does highlight the need to check the accuracy and reasonableness of the results, both through empirical evidence and verification of the model’s specification.  As explained in the Year Three Decision:

the standard of “reasonably supported by the evidence in the record” only requires that the amount approved by the Board be reasonably supported, and the Superintendent is not required to determine that it is the most reasonable amount.  As such, direct comparison between approved savings figures is not straightforward, especially when the judgment as to reasonable support for each of these figures is based on the information in the record during each respective proceeding.  Nevertheless, the magnitude of the difference between the figures approved by the Board for Year Two and Year Three would make it unreasonable to accept the Year Three figure without an understanding of why these estimates are so different.

Checking for accuracy and reasonableness is always prudent, but is crucial when both the methodology and the results are significantly different from those in prior years.  However, there is no evidence on the record that srHS performed any such checks.  The srHS report attributed the large increase produced by the new methodology to the global approach used, stating:

Dirigo has had multiple, overlapping impacts on the health care expenditures in Maine, including reducing premiums and generating more competition for all health care purchasers. These changes increase enrollment into health care insurance by making premiums more affordable. Due to those impacts, the BD/CC is lower in Maine now than it would have been if Dirigo was never signed into law.

However, no empirical evidence was provided to support this assertion.  To check reasonableness, srHS could have investigated the sources of the change in uninsurance rates.  An analysis of the extent of changes in the various categories of insurance (such as private coverage, MaineCare, Medicare, and military coverage) and the impact of Dirigo over time on enrollment in each category might be illuminating.  Census data is available from sources cited in the srHS Report to support this type of analysis.

When DHA’s witness, Dr. Kenneth Thorpe of Emory University, was questioned about this topic at the hearing, the only impacts he cited beyond those considered in prior years were increased private health insurance due to lower rates and increased enrollment in MaineCare due to publicity associated with Dirigo.  (R. at Tab 3-61, p. 320-321).

With regard to private health insurance coverage, there is no indication on the record that srHS looked at enrollment changes or at how those changes compare to other states.  Furthermore, evidence on the record casts doubt on the existence of any increase in private coverage.  The population data and uninsurance rates shown in the srHS Report (R. at Tab 4-64, p. 70) indicate a decrease of about 17,000 in the number of uninsured in Maine from 2002 to 2006.  The MaineCare enrollment numbers provided by Mr. Burke (R. at Tab 1-29, Burke Exhibit 5) indicate an increase of about 52,000 from December 2002 to December 2006.  Thus, increased enrollment in MaineCare is a more plausible explanation for the decrease in the number of uninsured than any increase in private coverage.  Because only a part of the MaineCare increase can be attributed to the Dirigo Act, it is essential to use methods that can distinguish between Dirigo-related MaineCare enrollment and other factors that have led to increased enrollment.

One way to check the accuracy and specification of the regression models would have been to run the models for 2002 and earlier years to see how closely the results matched actual uninsurance rates in Maine.  There is no indication on the record that this was done.  Due to the lack of documentation and detailed explanations for the models, the Superintendent and her staff and consultants were unable to perform this check or to validate the results of the regression models from the information in the record.  However, some indication of the validity of the models can be gleaned from analyzing the results for 2003.

The payor intervenors questioned the inclusion of 2003 in the post-Dirigo period.  The Dirigo Act was not effective until late in the year and enrollment in DirigoChoice and the MaineCare parents expansion did not begin until 2005.  On the other hand, health care reform was certainly in the public eye throughout 2003.  The Superintendent is aware that the “sentinel effect” can affect behavior even before a new law takes effect.  Nonetheless, while there may have been some “Dirigo effect” on the uninsurance rate in 2003, it is unreasonable to attribute all of the 2003 change to Dirigo.  There was a major decline in the rate of uninsurance in 2003 that is largely explained by a Medicaid expansion that began in 2002.  While it is possible that the 2003 enrollment in this expansion was slightly higher than it would have been in the absence of Dirigo, most of it likely would have occurred anyway.  If the srHS models were well-specified, one would expect them to produce 2003 uninsurance rates close to or slightly above the actual value of 11.79%.  Instead, they resulted in much higher values.

Accordingly, the results of these regression models are not sufficiently reliable to serve as a basis for estimating the reduction in the number of uninsured individuals in Maine that can be attributed to Dirigo.

In addition to these serious questions about the reliability of the models, there are also problems with how the model results were used.  First, the 2006 results were projected to 2008 based on the average 2002-2006 rate of change.  There was no analysis, however, as to whether factors affecting the uninsurance rate in 2003-2006 would be likely to continue, or whether other factors that were not present in those years might impact the rate in 2007 and 2008.  Another problem with this projection is that it inflates the effect of any error or bias in the model itself.

Similarly, the actual 2006 uninsurance rate was projected to 2008 based on the average 2002-2006 rate of change, without considering whether factors affecting the uninsurance rate in 2007 and 2008 were likely to resemble those in 2003-2006.  As noted by the payor intervenors, the decrease in the uninsurance rate from 2002 to 2006 resulted largely from three sources: the 2002 implementation of the MaineCare non-categorical adult expansion, the 2005 implementation of the MaineCare parents expansion, and the 2005 implementation of DirigoChoice.  Payor intervenors argued that it is not reasonable to project similar decreases in 2007 and 2008 in the absence of further “shocks” to the system.  In response, Dr. Thorpe testified that “the state is not planning on abolishing those programs.”  (R. at Tab 3-61, p. 276).

In reviewing the record, one must consider, however, the impact on uninsurance trends from changes in enrollment in these programs, not just their continued existence.  In the case of the 2002 MaineCare expansion, the record shows that enrollment peaked in early 2005 and has been declining recently.  (R. at Tab 1‑29, Burke Exhibit 5).  New enrollment in DirigoChoice has been severely limited due to funding issues.  Therefore, it is not reasonable to project decreases in the uninsurance rate at the same pace as the earlier period.

As discussed above, the difference between the projected 2008 uninsurance rates with and without Dirigo was multiplied by an average per capita cost of care of $893, based on 2005 values from the report: “Paying the Premium, The Added Cost of Care for the Uninsured” (R. at Tab 4-65, Document 12), trended to 2008.  Mr. Burke testified that a reduction to this amount is necessary because a portion of the cost of uncompensated care was actually paid by other entities and would not become bad debt or charity care incurred by providers.  He derived a 36% reduction from Maine data at page 32 of the report, and no contrary evidence in the record has been cited.  Furthermore, the report also states that the uncompensated care is based on “what the privately insured would pay, on average, in the state for the same health care services.”  (R. at Tab 4-65, Document 12, p. 13).  To the extent that a portion of the reduction in uninsured rates that can be attributed to Dirigo is related to increases in MaineCare enrollment, there should have been an additional reduction in the assumed per capita cost to reflect the fact that MaineCare reimbursements are lower than private health insurance reimbursements.  There should also have been a small adjustment to reflect another payment category cited in the report, which was other non-patient non‑government sources of revenue, including philanthropy.

Also, the per capita estimates derived from this report were derived from estimated uncompensated care for an assumed 161,000 uninsured Maine residents in 2005.  (R. at Tab 4‑65, Document 12, p. 8).  According to the srHS Report, the actual number of uninsured Maine residents was about 130,000 in 2005.  (R. at Tab 4-64, p. 70).  This large difference casts further doubt on the validity of the assumed $893 per capita cost.

The determination of $23.6 million for savings due to the Uninsured / Underinsured savings initiatives is based on an average per capita cost of care of $893 and a reduction of about 26,500 in the number of uninsured Maine residents that can be attributed to Dirigo.  Neither assumption is reasonably supported by the evidence in the record.  While it may be possible to approximate a reasonably supported amount for the average per capita cost from the record, it is not possible to do so for the number of newly covered Maine residents that can be attributed to Dirigo unless the srHS methodology is discarded and Mr. Burke’s analysis is utilized.
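For context, the Board's $23.6 million figure follows, to within rounding, from the two inputs described above. A minimal arithmetic check (both inputs are from the record; treating the figure as a simple product of the two is an assumption, since "about 26,500" is itself rounded):

```python
# Arithmetic check of the Board's Uninsured / Underinsured determination.
# Both inputs appear in the record; the exact rounding is an assumption.
per_capita_cost = 893         # average per capita cost of care, trended to 2008 ($)
uninsured_reduction = 26_500  # approximate reduction in uninsured attributed to Dirigo

savings = per_capita_cost * uninsured_reduction
print(f"${savings / 1e6:.1f} million")  # close to the $23.6 million determined;
                                        # 26,500 is itself a rounded figure
```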

Mr. Burke’s estimate of $6.1 million is based on enrollment of 12,050 in Dirigo and 5,600 in the MaineCare expansion.  (R. at Tab 1-29, Burke Exhibit 2, Attachment II).  From data in Table 1 (p. 70 of the srHS Report), it can be determined that the actual uninsured counts are 139,000 in 2002, 127,000 in 2003, and 122,000 in 2006.  Given the magnitude of these changes, it appears that the enrollment assumed in the Burke report provides a credible explanation for many of the changes.  The limitations of this approach, however, must be kept in mind, as it is not clear that Mr. Burke fully addresses the impact of providing formerly underinsured people with better coverage through MaineCare or through DirigoChoice.  This methodology also fails to consider the potential effects of carriers holding down rates and adjusting their product designs to compete with DirigoChoice, enabling additional people to avoid the ranks of the uninsured and underinsured.  However, there is a lack of evidence in this record demonstrating that those effects have been appreciable in this market at this time.  Thus, despite the deficiencies observed in both the srHS and Burke models, the Burke estimate of Uninsured / Underinsured savings is the only one that is reasonably supported by this record.

Conclusion

Accordingly, the Superintendent finds that the Board’s estimate of $23.6 million is not reasonably supported by the evidence in the record, and that the part of that estimate that is reasonably supported by the evidence in the record is $6.1 million, as reflected in Mr. Burke’s estimate.

3. Medical Loss Ratio.  Board determination:  $6.6 million.  Amount the Superintendent finds reasonably supported by the evidence:  $6.6 million.

This year’s Board filing included, for the first time, an initiative referred to as Medical Loss Ratio, or MLR.  One component of the Dirigo reform legislation was greater oversight over small group pricing and loss ratios.  Dirigo legislation requires small group insurers either to file rates for prior approval by the Superintendent, or to maintain a loss ratio of at least 78% over any 36-month period and to refund any excess premiums to policyholders if needed to achieve the target loss ratio.  24-A M.R.S.A. § 2808‑B(2‑C), enacted by P.L. 2003, ch. 469, § E‑16.
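The refund mechanism described above can be sketched as follows. This is an illustrative simplification with hypothetical numbers, not the statute's exact computation; it assumes the refund is the amount needed to bring the 36-month loss ratio up to the 78% floor, treating the refund as a reduction of earned premium:

```python
# Illustrative sketch of the 78% loss-ratio refund mechanism under
# 24-A M.R.S.A. § 2808-B(2-C). Hypothetical numbers; the statute's precise
# computation may differ from this simplification.
TARGET_LOSS_RATIO = 0.78

def excess_premium_refund(premium: float, claims: float) -> float:
    """Refund needed so that claims / (premium - refund) reaches the target,
    assuming the refund is treated as a reduction of earned premium."""
    if claims / premium >= TARGET_LOSS_RATIO:
        return 0.0  # loss ratio already at or above 78%; no refund owed
    return premium - claims / TARGET_LOSS_RATIO

# Example: $100M of premium with $72M of claims over a 36-month period
refund = excess_premium_refund(100e6, 72e6)
print(f"${refund / 1e6:.2f} million refund")
```

With these hypothetical inputs, the resulting loss ratio after the refund is exactly 78%.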

For the first time, a refund has been paid under the terms of this legislation.  The srHS Report includes a report from Aetna Life Insurance Company showing a premium refund of about $6.6 million payable in early 2008, resulting from loss ratios for the three years ended June 30, 2007.  The srHS Report has asserted that this is a savings that can be attributed to Dirigo and has recommended the inclusion of this amount in the Year 4 determination.  The Board has agreed and included this amount.

The payor intervenors have raised several objections to the inclusion of this amount by the Board.  One issue they have raised is that this refund is payable to employers, not health care providers, and is therefore not a “savings to the Maine health care system.”  Another issue raised is recoverability; because the refunds were paid to employers, they will not benefit health care providers.  (Chamber brief, August 26, 2008, pp. 36-37).  In addition, there was discussion in the pre-filed testimony of Mr. Burke of the possibility that insurers would include a risk charge component in their pricing to account for the additional expense of the premium refund that would be incurred in times of favorable claim levels.  (R. at Tab 1-29, p. 11).  Such a risk charge would offset the savings enjoyed by employers when loss ratios fell below the mandated threshold.  However, there was no evidence in the record that any small group carriers in Maine had included such a risk charge in their pricing.

Any refunds provided pursuant to this provision of the Dirigo Act reduce the cost of health insurance for the employer groups that receive the refund.  Indeed, the amount saved by these groups and the basis for the refunds in the Dirigo Act are not in dispute; the only question is whether savings to particular policyholders that do not necessarily have any connection with reduced costs of health care, and by their nature can never be recovered by insurers, are the kind of “measurable cost savings” contemplated by 24‑A M.R.S.A. § 6913(1)(A).12 That is a matter outside the Superintendent’s jurisdiction, for the reasons explained earlier in the general discussion of legal issues.  The Board has already considered the issues raised by the payor intervenors, and has determined that the inclusion of the refund in its savings determination is appropriate.

Based on these considerations, the Superintendent finds that the amount of $6.6 million determined by the Board for the Medical Loss Ratio initiative is reasonably supported by the evidence in the record.

4. Overlap.  Board determination:  $0.  Amount the Superintendent finds reasonably supported by the evidence:  $4.0 million (this is a negative adjustment).

“Overlap” refers to the potential double-counting of savings among the various initiatives.

Overlap between CMAD and Uninsured / Underinsured Savings Initiatives

The srHS Report asserts that there is no overlap between the Hospital Cost and Uninsured / Underinsured savings initiatives because “the analysis for BD/CC includes only those costs, charges, and discharges that would have existed in the absence of Dirigo as well as in the presence of Dirigo.”  (R. at Tab 4-64, p. 20).  Payor intervenors dispute this.  The Chamber argued that the global approach srHS used for Uninsured / Underinsured “swallows” both Hospital Cost and MLR savings.  (Chamber Brief p. 37).  Since the Uninsured / Underinsured savings found reasonable in this decision are not based on the srHS methodology, this specific dispute is moot.  The question that needs to be answered is whether there is overlap between the Hospital Cost and Uninsured / Underinsured savings found reasonable in this decision.

MEAHP witness Burke asserts that “Any reduction in bad debt at the hospital would in theory reduce the pressure on the cost per case for the remaining, paying customers, and would thus be reflected in the CMAD calculation.”  (R. at Tab 1-29, Burke Exhibit 2, p. 10).  Although the focus on customer impact suggests that Mr. Burke is discussing hospital charges, and the methodologies used in this case analyze the underlying costs, a substantially similar rationale applies to costs.  If bad debt and charity care are included in hospital reports as “expenses,” then all reductions in hospital bad debt and charity care are included, dollar for dollar, within the overall reductions in ECMAD.  This means that all savings in bad debt and charity care under the Uninsured / Underinsured savings were already captured within the Hospital Cost savings initiative, to the extent that those savings are attributable to hospitals.  Because nothing on the record appears to refute this theory, there is overlap to whatever extent uncompensated care is reflected in the numerator of the ECMAD calculation.  There is no indication in the record that the data received from AHD had been adjusted to remove uncompensated care revenue or cost.

When asked about this adjustment, Steven P. Schramm of srHS testified that he did not know whether it had been removed.  (R. at Tab 2‑60, p. 125).  The srHS Report lists further adjustments made to the data after it was received from AHD (pp. 44-45) but does not indicate any reduction to remove uncompensated care.  Therefore, the record would not reasonably support any figure that fails to account for overlap between the Hospital Cost and Uninsured / Underinsured savings, to the extent that the Uninsured / Underinsured savings reflect hospital costs.

Mr. Burke cites sources indicating that 66% of uncompensated care is for hospital care, and they have not been contradicted by any of the parties.  Id.  Applying this factor to the $6.1 million of Uninsured / Underinsured savings results in an overlap of $4.0 million.
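The $4.0 million adjustment follows directly from these two figures. A minimal arithmetic check (both inputs are from the record; the rounding to one decimal is the decision's):

```python
# Overlap between the Hospital Cost and Uninsured / Underinsured savings:
# the hospital share of uncompensated care applied to the Burke estimate.
hospital_share = 0.66       # share of uncompensated care that is hospital care
uninsured_savings = 6.1e6   # Burke estimate of Uninsured / Underinsured savings ($)

overlap = hospital_share * uninsured_savings
print(f"${overlap / 1e6:.1f} million")  # -> $4.0 million negative adjustment
```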

Overlap between Uninsured / Underinsured and MLR Savings Initiatives

As discussed above, the Chamber’s argument that the global approach srHS used for Uninsured / Underinsured “swallows” both CMAD and MLR is moot.  MEAHP argues that to the extent the MLR initiative reduces the cost of insurance, it would be reflected in the reduction in the number of uninsured.  However, this does not demonstrate that Uninsured / Underinsured savings include any MLR savings.  The MLR reflects savings to currently insured groups.  To the extent that reduced insurance premiums resulted in newly insured people, that would reflect additional savings that are not included in the MLR component of the filing.

While the Uninsured / Underinsured savings do not include any MLR savings, the reverse is not necessarily true.  The MLR savings result from the amount of medical claims for a block of small group health insurance policies being less than 78% of premium.  To the extent that the amount of claims was reduced due to Uninsured / Underinsured savings that are passed on to insurers, part of the MLR savings could double-count the Uninsured / Underinsured savings, but only if insurers failed to reflect those savings when they set the premiums.  Reducing premiums to reflect anticipated Uninsured / Underinsured savings is required by law, 24‑A M.R.S.A. § 6913(9), and if those savings are reflected accurately, the combined effect on the medical loss ratio of the reduced claims and reduced premiums would be neutral.  The record includes no evidence to the contrary.  Accordingly, there is no evidence of any overlap in this category.
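The neutrality point can be illustrated with hypothetical numbers: if the insurer reduces premiums in proportion to the anticipated claims savings, the medical loss ratio is unchanged, so no refund overlap arises.

```python
# Illustration of the loss-ratio neutrality argument (hypothetical numbers).
# If anticipated savings are reflected in premiums in proportion to the
# loss ratio, reduced claims and reduced premiums leave the ratio unchanged.
premium, claims = 100.0, 78.0
lr_before = claims / premium                    # 0.78

claims_savings = 7.8                            # savings passed on to the insurer
premium_reduction = claims_savings / lr_before  # matching premium cut

lr_after = (claims - claims_savings) / (premium - premium_reduction)
print(lr_before, lr_after)  # both approximately 0.78
```

Algebraically, (c − s) / (p − s/(c/p)) simplifies to c/p, so the ratio is preserved for any savings amount s.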

Overlap between CMAD and MLR Savings Initiatives

Similarly, overlap could exist between CMAD and MLR if the MLR savings resulted in part from CMAD savings being passed on to insurers.  However, here again, there would be no overlap if these savings were also reflected in premiums.  Because there is no evidence on the record that any CMAD savings passed on to insurers were not reflected in lower premiums, there is no evidence of any overlap.

Conclusion

Based on this analysis, the Superintendent finds the Board’s determination that there is no overlap not to be reasonably supported by the evidence in the record, and finds instead that the evidence in the record reasonably supports a finding of overlap totaling $4.0 million.

V. ORDER

By reason of the foregoing, the Superintendent ORDERS that the Dirigo Board’s determination of aggregate measurable cost savings is APPROVED IN PART and that $48.7 million of aggregate measurable cost savings approved by the Dirigo Board is found by the Superintendent to be reasonably supported by the evidence in the record.

VI. NOTICE OF APPEAL RIGHTS

This Decision and Order is final agency action of the Superintendent of Insurance within the meaning of the Maine Administrative Procedure Act.  Pursuant to Bureau of Insurance Rule 350, § 18, the time for appeal runs from the issuance of this Part II Decision and Order.  Any party may appeal this Decision and Order to the Superior Court as provided by 24-A M.R.S.A. § 236, 5 M.R.S.A. § 11001, et seq., and M.R. Civ. P. 80C.  Any such party must initiate an appeal within thirty days after receiving this notice.  Any aggrieved non-party whose interests are substantially and directly affected by this Decision and Order may initiate an appeal within forty days after the issuance of this decision.  There is no automatic stay pending appeal; application for stay may be made as provided in 5 M.R.S.A. § 11004.

 

1 The full complement of the Board consists of 9 voting members and 4 ex officio nonvoting members.  24‑A M.R.S.A. § 6904(1).  Four of the voting positions, however, were vacant at the time of the agency’s decision; the members of the Dirigo Board at that time were:  Robert McAfee, M.D., Chair; Jonathan Beal, Esq.; Edward David, M.D.; Mary Anne Turowski; Mary McAleney; Trish Riley ex officio, Director of the Governor’s Office of Health Policy & Finance; Rebecca Wyke ex officio, Commissioner of the Maine Department of Administrative & Financial Services; Anne Head ex officio, (then Acting) Commissioner of the Maine Department of Professional & Financial Regulation; and David Lemoine ex officio, Maine State Treasurer.  Ms. Riley and Ms. Wyke recused themselves and did not participate in the deliberations.

2 The $190.2 million of aggregate measurable cost savings determined by the srHS Report consists of Hospital initiatives of $147.9 million, Uninsured / Underinsured initiatives of $35.7 million, Medical Loss Ratio of $6.6 million, with no reduction from Overlap.

3 A substantially similar definition has now been codified at 22 M.R.S.A. § 1721(1)(B), which took effect July 18, 2008, after the filing of the srHS Report.

4 Hospital charge levels are closely related to hospital revenue if, for example, the hospital is paid a percentage of its charges by a particular payor.  In other cases, however, there might be no relation between revenue and nominal charge, as when the hospital is paid a predetermined fixed fee.

5 Given the sophistication and complexity of a multistate multivariate model, it is highly desirable for the Board to retain its own expert to help it analyze the validity of the results.  Such additional steps would facilitate an early resolution of questions about such factors as assumptions, data, and methodology, and would enhance the record to allow for better public review.

6 Practically speaking, this adjustment had a very small effect on the answer, as input prices (wage rates, prices of supplies, etc.) are not a major determinant of hospital cost trends.

7 Microsoft Access, unlike some more full-featured database software, does not generate a record of manually changed database records.

8 This is implemented by estimating generalized linear models with a log link and gamma family distribution assumption.

9 In effect, even though multistate data are being used in the regression, this approach extrapolates the pre-Dirigo Maine-specific trend as the basis of the post-Dirigo cost savings.

10 The fact that the Cluster regressions produced similar results to our modified regression, even though the srHS specification is essentially based on a projection of Maine-only results, is due to the simple coincidence that the effect of the multistate adjustment roughly washes against the effect of introducing the control variables. There was no reason to know before performing the calculations that this would be the case.

11 The regression models were primarily developed by Dr. Kenneth Thorpe. References to srHS in this Decision and Order should be understood to incorporate Dr. Thorpe’s work, which was included as an appendix to the srHS Report.

12 In Year One, the Superintendent was faced with a similar question when reviewing the question of savings from the Voluntary Underwriting Gain (VUG) initiative, under which insurers were encouraged to limit their underwriting profit.  Although the Superintendent determined in that instance that there were no measurable savings, that conclusion was based entirely on the lack of evidentiary support.  The Superintendent declined to review the Board’s inclusion of VUG within aggregate measurable cost savings for the same jurisdictional reasons as apply to the Board’s inclusion of MLR this year.  See discussion in section IV(A) above.

 

PER ORDER OF THE SUPERINTENDENT OF INSURANCE

Dated: September 30, 2008 ________________________________
MILA KOFMAN, Superintendent

 

