02-001719BID KPMG Consulting, Inc. vs. Department of Revenue
 Status: Closed
Recommended Order on Thursday, September 26, 2002.


Summary: Agency established that flexible standards for the evaluation committee's scoring of ITN proposals were appropriate in the initial stage of ITN review, in a complex computer system procurement, where the evaluators were trained and sophisticated as to the Agency's needs.

STATE OF FLORIDA
DIVISION OF ADMINISTRATIVE HEARINGS

KPMG CONSULTING, INC.,

          Petitioner,

vs.                                        Case No. 02-1719BID

DEPARTMENT OF REVENUE,

          Respondent,

and

DELOITTE CONSULTING, INC.,

          Intervenor.

RECOMMENDED ORDER

Pursuant to notice, this cause came on for formal proceeding and hearing before P. Michael Ruff, duly-designated Administrative Law Judge of the Division of Administrative Hearings. The hearing was conducted on June 24 and 26, 2002, in Tallahassee, Florida. The appearances were as follows:

APPEARANCES

For Petitioner:  Robert S. Cohen, Esquire
                 D. Andrew Byrne, Esquire
                 Cooper, Byrne, Blue & Schwartz, LLC
                 1358 Thomaswood Drive
                 Tallahassee, Florida 32308

For Respondent:  Cindy Horne, Esquire
                 Earl Black, Esquire
                 Department of Revenue
                 Post Office Box 6668
                 Tallahassee, Florida 32399-0100

For Intervenor:  Seann M. Frazier, Esquire
                 Greenberg, Traurig, P.A.
                 101 East College Avenue
                 Tallahassee, Florida 32302

STATEMENT OF THE ISSUE

The issue to be resolved in this proceeding concerns whether the Department of Revenue (Department, DOR) acted clearly erroneously, contrary to competition, arbitrarily, or capriciously when it evaluated the Petitioner's submittal in response to an Invitation to Negotiate (ITN) for a child support enforcement automated management system-compliance enforcement (CAMS CE), in which it awarded the Petitioner a score of 140 points out of a possible 230 points and disqualified the Petitioner from further consideration in the invitation to negotiate process.

PRELIMINARY STATEMENT

On April 22, 2002, the Petitioner, KPMG Consulting, Inc. (KPMG), filed a timely, formal written protest of its disqualification from further consideration by the Respondent in the CAMS CE procurement. The Respondent transmitted the Petition to the Division of Administrative Hearings for further proceedings, and the matter was set for hearing on May 13, 2002. Pursuant to a joint request from all parties, the hearing was continued until June 24 and 26, 2002. Deloitte Consulting, Inc. (Deloitte) filed a Petition for Intervention, which was granted without objection, and the formal hearing was conducted as noticed, on the above dates.

The Petitioner presented the testimony of two witnesses by deposition at hearing: James Focht, Senior Manager for KPMG, and Michael Strange, Business Development Manager for KPMG, as well as the depositions of the evaluators. The Petitioner presented nineteen exhibits, all of which were admitted into evidence. The Respondent presented the testimony of seven witnesses: Lillie Bogan, Child Support Enforcement Program Director; Randolph A. Esser, Information Systems Director for the Department of Highway Safety and Motor Vehicles; Edward Addy, Ph.D., Program Director for Northrop Grumman Information Technology; Frank Doolittle, Process Manager for Child Support Enforcement Compliance Enforcement; Andrew Michael Ellis, Revenue Program Administrator III for Child Support Enforcement Compliance Enforcement; H. P. Barker, Jr., Procurement Consultant; and Harold Bankirer, Deputy Program Director for the Child Support Enforcement Program. The Respondent presented one exhibit, which was admitted into evidence. No witnesses were presented by the Intervenor.

Upon conclusion of the hearing a transcript was requested, and the parties availed themselves of the opportunity to submit Proposed Recommended Orders. The Proposed Recommended Orders were considered in the rendition of this Recommended Order.

FINDINGS OF FACT

Procurement Background:

1. The Respondent, the Department of Revenue (DOR), is a state agency charged with the responsibility of administering the Child Support Enforcement Program (CSE) for the State of Florida, in accordance with Section 20.21(h), Florida Statutes. The DOR issued an ITN for the CAMS Compliance Enforcement implementation on February 1, 2002. This procurement is designed to give the Department a "state of the art system" that will meet all Federal and State Regulations and Policies for Child Support Enforcement, improve the effectiveness of collections of child support, and automate enforcement to the greatest extent possible. It will automate data processing and other decision-support functions and allow rapid implementation of changes in regulatory requirements resulting from revised Federal and State Regulation Policies and Florida initiatives, including statutory initiatives.

CSE services suffer from dependence on an inadequate computer system known as the "FLORIDA System," which was not originally designed for CSE and is housed and administered in another agency. The current FLORIDA System cannot meet the Respondent's needs for automation, does not meet the Respondent's management and reporting requirements, and does not provide the more flexible system the Respondent needs. The DOR needs a system that will ensure the integrity of its data, will allow the Respondent to consolidate some of the "stand-alone" systems it currently has in place to remedy certain deficiencies of the FLORIDA System, and will help the Child Support Enforcement system and program secure needed improvements.

2. The CSE is also governed by Federal Policy, Rules and Reporting requirements concerning performance. It has become apparent that, in order to improve its effectiveness in responding to Federal requirements and to its business partners in the court system, the Department of Children and Family Services, the Sheriff's Departments, employers, financial institutions and workforce development boards, the CSE agency needs a new computer system with the flexibility to respond to the complete requirements of the CSE program.

3. In order to accomplish its goal of acquiring a new computer system, the CSE began the procurement process. The Department hired a team from the Northrop Grumman Corporation, headed by Dr. Edward Addy, to direct the procurement development process. Dr. Addy began a process of defining CSE needs and then developing an ITN which reflected those needs. The process included many individuals in CSE who would be the daily users of the new system. These individuals included Andrew Michael Ellis, Revenue Program Administrator III for Child Support Enforcement Compliance Enforcement; Frank Doolittle, Process Manager for Child Support Enforcement Compliance Enforcement; and Harold Bankirer, Deputy Program Director for the Child Support Enforcement Program.

4. There are two alternative strategies for implementing a large computer system such as CAMS CE: a customized system developed especially for CSE, or a Commercial Off The Shelf/Enterprise Resource Plan (COTS/ERP) system. A COTS/ERP system is a pre-packaged software program which is implemented as a system-wide solution. Because there is no existing COTS/ERP for child support programs, the team recognized that customization would be required to make the product fit its intended use. The team recognized that other system attributes were also important, such as the ability to convert "legacy data" and to address such factors as data base complexity and data base size.

The Evaluation Process:

5. The CAMS CE ITN put forth a tiered process for selecting vendors for negotiation. The first tier involved an evaluation of key proposal topics. The key topics were the vendor's past corporate experience (past projects) and its key staff. A vendor was required to score 150 out of a possible 230 points to enable it to continue to the next stage or tier of consideration in the procurement process. The evaluation team wanted to remove, at an early stage, vendors who did not have a serious chance of becoming the selected vendor. This would prevent an unnecessary expenditure of time and resources by both the CSE and the vendor. The ITN required that the vendors provide three corporate references showing their past corporate experience for evaluation. In other words, the references involved past jobs they had done for other entities which showed relevant experience in relation to the ITN specifications. The Department provided forms to the vendors, who in turn provided them to the corporate references that they themselves selected. The vendors also included in their proposals a summary of their corporate experience, drafted by the vendors themselves. Table 8.2 of the ITN provided positive and negative criteria by which the corporate references would be evaluated. The list in Table 8.2 is not meant to be exhaustive and is in the nature of an "included but not limited to" standard. The vendors had the freedom to select references whose projects the vendors believed best fit the criteria upon which each proposal was to be evaluated.

6. For the key staff evaluation standard, the vendors provided summary sheets as well as résumés for each person filling a lead role as a key staff member on their proposed project team. Having a competent project team was deemed by the Department to be critical to the success of the procurement and implementation of a large project such as the CAMS CE. Table 8.2 of the ITN provided the criteria by which the key staff would be evaluated.

The Evaluation Team:

7. The CSE selected an evaluation team which included Dr. Addy, Mr. Ellis, Mr. Bankirer, Mr. Doolittle and Mr. Esser. Although Dr. Addy had not previously performed the role of an evaluator, he has responded to several procurements for Florida government agencies. He is familiar with Florida's procurement process and has a doctorate in Computer Science, as well as seventeen years of experience in information technology. Dr. Addy was the leader of the Northrop Grumman team which primarily developed the ITN, with the assistance of personnel from the CSE program itself. Mr. Ellis, Mr. Bankirer and Mr. Doolittle participated in the development of the ITN as well. Mr. Bankirer and Mr. Doolittle had been evaluators in other procurements for Federal and State agencies prior to joining the CSE program. Mr. Esser is the Chief of the Bureau of Information Technology at the Department of Highway Safety and Motor Vehicles and has experience in similar, large computer system procurements at that agency. The evaluation team selected by the Department thus has extensive experience in computer technology, as well as knowledge of the requirements of the subject system.

8. The Department provided training regarding the evaluation process to the evaluators, as well as a copy of the ITN, the Source Selection Plan and the Source Selection Team Reference Guide. Section 6 of the Source Selection Team Reference Guide, entitled "Scoring Concepts," provided guidance to the evaluators for scoring proposals. Section 6.1, entitled "Proposal Evaluation Specification in ITN Section 8," states:

     Section 8 of the ITN describes the method by which proposals will be evaluated and scored. SST evaluators should be consistent with the method described in the ITN, and the source selection process documented in the Reference Guide and the SST tools are designed to implement this method.

     All topics that are assigned to an SST evaluator should receive at the proper time an integer score between 0 and 10 (inclusive). Each topic is also assigned a weight factor that is multiplied by the given score in order to place a greater or lesser emphasis on specific topics. (The PES workbook is already set to perform this multiplication upon entry of the score.)

     Tables 8-2 through 8-6 in the ITN Section 8 list the topics by which the proposals will be scored along with the ITN reference and evaluation and scoring criteria for each topic. The ITN reference points to the primary ITN section that describes the topic. The evaluation and scoring criteria list characteristics that should be used to affect the score negatively or positively. While these characteristics should be used by each SST evaluator, each evaluator is free to emphasize each characteristic more or less than any other characteristic. In addition, the characteristics are not meant to be inclusive, and evaluators may consider other characteristics that are not listed. . . . (Emphasis supplied.)

The preponderant evidence demonstrates that all the evaluators followed these instructions in conducting their evaluations, and none used a criterion that was not contained in the ITN, either expressly or implicitly.
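As an editorial illustration of the arithmetic the quoted guide describes (the 0-to-10 integer scale, the per-topic weight factors, and the 230-point first-tier maximum are in the record; the sample weight below is assumed), each proposal's first-tier total is the weighted sum of its topic scores:

$$\text{Total} = \sum_i w_i \, s_i, \qquad s_i \in \{0, 1, \ldots, 10\}$$

Since a perfect raw score of 10 on every topic must yield the 230-point maximum, the weights satisfy $10 \sum_i w_i = 230$, so $\sum_i w_i = 23$. A raw score of 7 on a topic with an assumed weight of 3 would thus contribute $7 \times 3 = 21$ of the 230 available points.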

Scoring Method:

9. The ITN used a 0 to 10 scoring system. The Source Selection Team Guide required that the evaluators use whole integer scores. They were not required to start at "7," which was the average score necessary to achieve a passing 150 points, and then to score up or down from 7. The Department also did not provide guidance to the evaluators regarding the relative value of any score, i.e., what a "5" is as opposed to a "6" or a "7." There is no provision in the ITN which establishes a baseline score or starting point from which the evaluators were required to adjust their scores.

10. The procurement development team had decided to give very little structure to the evaluators, as they wanted each evaluator to score based upon his or her understanding of what was in the proposal. Within the ITN the development team could not sufficiently characterize every potential requirement, in whatever form it might be submitted, and still provide the consistency of scoring that one would want in a competitive environment. This open-ended approach is a customary method of scoring, particularly in more complex procurements, in which generally less guidance is given to evaluators. Precise guidance regarding the relative value of any score, the imposition of a baseline score or starting point from which evaluators were required to adjust their scores, instruction as to the weighing of scores, and other indicia of precise structure would be more appropriate where the evaluators themselves were not sophisticated, trained and experienced in the type of computer system desired and in the field of information technology and data retrieval generally. The evaluation team, however, was shown to be experienced and trained in information technology and data retrieval and experienced in complex computer system procurement.

11. Mr. Barker is the former Bureau Chief of Procurement for the Department of Management Services. He has 34 years of procurement experience and has participated in many procurements for technology systems similar to CAMS CE. He established that the scoring system used by the Department at this initial stage of the procurement process is a common method. It is customary to leave the numerical value of scores to the discretion of the evaluators, based upon each evaluator's experience and review of the relevant documents. According wider discretion to evaluators in such a complex procurement process tends to produce more objective scores.

12. The evaluators scored past corporate experience (references) and key staff according to the criteria in Table 8.2 of the ITN. The evaluators then used different scoring strategies within the discretion accorded to them by the 0 to 10 point scale. Mr. Bankirer established a midrange of 4 to 6 and added or subtracted points based upon how well the proposal addressed the CAMS CE requirements. Evaluator Ellis used 6 as his baseline and added or subtracted points from there. Dr. Addy evaluated the proposals as a composite, without a starting point. Mr. Doolittle started with 5 as an average score and then added or subtracted points. Mr. Esser gave points for each attribute in Table 8.2, for key staff, and added the points for the score. For the corporate reference criterion, he subtracted a point for each attribute the reference lacked. As each of the evaluators used the same methodology for the evaluation of each separate vendor's proposal, each vendor was treated the same, and thus no specific prejudice to KPMG was demonstrated.

Corporate Reference Evaluation:

13. KPMG submitted three corporate references: Duke University Health System (Duke), SSM Health Care (SSM), and Armstrong World Industries (Armstrong). Mr. Bankirer gave the Duke reference a score of 6, the SSM reference a score of 5, and the Armstrong reference a score of 7. Michael Strange, the KPMG Business Development Manager, believed that 6 was a low score. He contended that an average score of 7 was required to make the 150-point threshold for passage to the next level of the ITN consideration. Therefore, a score of 7 would represent minimum compliance, according to Mr. Strange. However, neither the ITN nor the Source Selection Team Guide identified 7 as a minimally compliant score. Mr. Strange's designation of 7 as a minimally compliant score is not provided for in the specifications or the scoring instructions.

14. Mr. James Focht, Senior Manager for KPMG, testified that 6 was a low score, based upon the quality of the reference that KPMG had provided. However, Mr. Bankirer found that the Duke reference was actually a small-sized project, with few system development attributes, and that it did not include information regarding the number of records, the data base size involved, the estimated and actual costs, or the attributes of data base conversion. Mr. Bankirer determined that the Duke reference had little similarity to the CAMS CE procurement requirements and did not show training or data conversion as attributes of the Duke procurement, both of which are attributes necessary to the CAMS CE procurement. Mr. Strange and Mr. Focht admitted that the Duke reference did not specifically contain the element of data conversion and that, under Table 8.2, omission of this information would negatively affect the score. Mr. Focht admitted that there was no information in the Duke Health reference regarding the number of records and the data base size, all of which factors diminish the quality of Duke as a reference and thus the score accorded to it.

15. Mr. Strange opined that Mr. Bankirer had erred in determining that the Duke project was a significantly small-sized project, since it only had 1,500 users. Mr. Focht believed that the only size criterion in Table 8.2 was the five million dollar cost threshold and that, because KPMG indicated that the project cost was greater than five million dollars, KPMG had met the size criterion. Mr. Focht believed that evaluators had difficulty in evaluating the size of the projects in the references due to a lack of training. Mr. Focht was of the view that the evaluators should have been instructed to make "binary choices" on issues such as size. He conceded, however, that evaluators may have looked at other criteria in Table 8.2 to determine the size of a project, such as database size and number of users. However, the corporate references were given composite scores by the evaluators, as the ITN did not require separate scores for each factor in Table 8.2. Therefore, Mr. Focht's focus on binary scoring for size, to the exclusion of other criteria, misstated the objective of the scoring process.

16. The score given to the corporate references was a composite of all of the factors in Table 8.2, and not merely monetary-value size. Although KPMG apparently contends that size, in terms of dollar value, is the critical factor in determining the score for a corporate reference, the vendor questions and answers provided at the pre-proposal conference addressed the issue of relevant criteria. Question 40 of the vendor questions and answers, Volume II, did not single out "project greater than five million dollars" as the only size factor or criterion:

     QUESTION: Does the state require that each reference provided by the bidder have a contract value greater than $5 million; and serve a large number of users; and include data conversion from a legacy system; and include training development?

     ANSWER: To get a maximum score for past corporate experience, each reference must meet these criteria. If the criteria are not fully met, the reference will be evaluated, but will be assigned a lower score depending upon the degree to which the referenced project falls short of these required characteristics.

Therefore, the cost of the project is shown to be only one component of a composite score.

17. Mr. Strange opined that Mr. Bankirer's comment regarding the Duke reference, "little development, mostly SAP implementation," was irrelevant. Mr. Strange's view was that the CAMS CE was not a development project and that Table 8.2 did not specifically list development as a factor on which proposals would be evaluated. Mr. Focht stated his belief that Mr. Bankirer's comment suggested that Mr. Bankirer did not understand the link between the qualifications in the reference and the nature of KPMG's proposal.

18. Both Strange and Focht believe that the ITN called for a COTS/ERP solution. Mr. Focht stated that the ITN references a COTS/ERP approach numerous times. Although many of the references to COTS/ERP in the ITN also refer to development, Mr. Strange admitted that the ITN was open to a number of approaches. Furthermore, both the ITN and the Source Selection Team Guide stated that the items in Table 8.2 are not all-inclusive and that the evaluators may look to other factors in the ITN. Mr. Bankirer noted that there is no current CSE COTS/ERP product on the market. Therefore, some development will be required to adapt an off-the-shelf product to its intended use as a child support case management system.

19. Mr. Bankirer testified that the Duke project was a small-size project with little development. Duke has three sites, while CSE has over 150 sites. Therefore, the Duke project is smaller than CAMS. There was no information provided in the KPMG submittal regarding data base size and number of records with regard to the Duke project. Mr. Bankirer did not receive the information he needed to infer a larger-sized project from the Duke reference.

20. Mr. Esser also gave the Duke reference a score of 6. The reference did not provide the data base information required, which was the number of records in the data base and the number of "gigabytes" of disc storage needed to store the data, and there was no element of legacy conversion.

21. Dr. Addy gave the Duke reference a score of 5. He accepted the dollar value as greater than five million dollars. He thought that the Duke project may have included some data conversion, but it was not explicitly stated. The Duke customer evaluated training, so he presumed training was provided with the Duke project. The customer ratings for Duke were high, as he expected they would be, but similarity to the CAMS CE system was not well explained. He looked at size in terms of number of users, number of records and database size; the numbers that were listed were for a relatively small-sized project. There was not much description of the methodology used, and so he gave it an overall score of 5.

22. Mr. Doolittle gave the Duke reference a score of 6. He felt that it was an average response. He listed the number of users, the number of locations, and the fact that the project was on time and on budget, but found that there was no mention of data conversion, database size or number of records (consistent with the other evaluators). A review of the evaluators' comments makes it apparent that KPMG's scores are more a product of the paucity of information provided by KPMG's corporate references than of any lack of evaluator knowledge of the material being evaluated.

23. Mr. Ellis gave a score of 6 for the Duke reference. He used 6 as his baseline. He found the required elements, but nothing more that justified, in his mind, raising the score above 6.

24. Mr. Focht and Mr. Strange expressed the same concerns regarding Mr. Bankirer's comment regarding little development for the SSM Health Care reference as they had for the Duke Health reference. However, both Mr. Strange and Mr. Focht admitted that the reference provided no information regarding training. Mr. Strange admitted that the reference had no information regarding data conversion; training and data conversion are criteria contained in Table 8.2. Mr. Strange also admitted that KPMG had access to Table 8.2 before the proposal was submitted and could have included the information in the proposal.

25. Mr. Bankirer gave the SSM reference a score of 5. He commented that the SAP implementation was not relevant to what the Department was attempting to do with the CAMS CE system. CAMS CE does not have any materials management or procurement components, which were the functions of the SAP components in the SSM reference project. Additionally, there was no training indicated in the SSM reference.

26. Mr. Esser gave the SSM reference a score of 3. His comments were "no training provided, no legacy data conversion, project evaluation was primarily for SAP not KPMG." However, it was KPMG's responsibility in responding to the ITN to provide project information concerning a corporate reference in a clear manner, rather than requiring that an evaluator infer compliance with the specifications. Mr. Focht believed that legacy data conversion could be inferred from the reference's description of the project. Mr. Strange opined that Mr. Esser's comment was inaccurate, as KPMG installed SAP and made the software work. Mr. Esser gave the SSM reference a score of 3 because the reference described SAP's role, but not KPMG's role, in the installation of the software. When providing information in the reference, SSM gave answers relating to SAP to the questions regarding system capability, system usability and system reliability, but did not state KPMG's role in the installation. SAP is a large enterprise software package. This answer created an impression of little KPMG involvement in the project.

27. Dr. Addy gave the SSM reference a score of 6. Dr. Addy found that the size was over five million dollars and that customer ratings were high, except for a 7 for usability, with reference to a "long learning curve" for users. Data conversion was implied. There was no strong explanation of similarity to CAMS CE. It was generally a small-sized project. He could reason some similarity into it, even though it was not well described in the submittal.

28. Mr. Doolittle gave the SSM reference a score of 6. Mr. Doolittle noted, as positive factors, that the total cost of the project was greater than five million dollars and that it supported 24 sites and 1,500 users, as well as a "migration from a mainframe." However, there were negative factors, such as training not being mentioned and a long learning curve for its users. Mr. Ellis gave a score of 6 for SSM, feeling that KPMG met all of the requirements but did not offer more than the basic requirements.

29. Mr. Strange opined that Mr. Bankirer, Dr. Addy and Mr. Ellis (evaluators 1, 5 and 4) were inconsistent with each other in their evaluation of the SSM reference. He stated that this inconsistency showed a flaw in the evaluation process, in that the evaluators did not have enough training to uniformly evaluate past corporate experience, thereby, in his view, creating an arbitrary evaluation process.

30. Mr. Bankirer gave the SSM reference a score of 5, Mr. Ellis a score of 6, and Dr. Addy a score of 6. Even though the scores were similar, Mr. Strange contended that the evaluators gave conflicting comments regarding the size of the project. Mr. Ellis stated that the size of the project was hard to determine, as the cost was listed as greater than five million dollars and the database size was given, but the number of records was not given. Mr. Bankirer found that the project was low in cost, and Dr. Addy stated that over five million dollars was a positive factor in his consideration. However, the evaluators looked at all of the factors in Table 8.2 in scoring each reference. Other factors that detracted from KPMG's score for the SSM reference were: similarity to the CAMS system not being explained, according to Dr. Addy; no indication of training (all of the evaluators); the number of records not being provided (evaluator Ellis); little development shown (Bankirer); and usability problems (Dr. Addy). Mr. Strange admitted that the evaluators may have been looking at other factors besides dollar-value size in order to score the SSM reference.

31. Mr. Esser gave the Armstrong reference a score of 6. He felt that the reference did not contain any data base information or cost data and that there was no legacy conversion shown. Dr. Addy also gave Armstrong a score of 6. He inferred that this reference had data conversion, as well as training and a high dollar volume, which were all positive factors. He could not tell, however, from the project description what role KPMG actually had in the project. Mr. Ellis gave a score of 7 for the Armstrong reference, stating that the Armstrong reference offered more information regarding the nature of the project than had the SSM and Duke references. Mr. Bankirer gave KPMG a score of 7 for the Armstrong reference. He found that the positive factors were that the reference had more site locations and offered training but, on the negative side, was not specific regarding KPMG's role in the project.

32. Mr. Focht opined that the evaluators did not understand the nature of the product and services the Department was seeking to obtain, as the Department's training did not cover the nature of the procurement and the products and services DOR was seeking. However, when he made this statement he admitted he did not know the evaluators' backgrounds. In fact, Bankirer, Ellis, Addy and Doolittle were part of the group that developed the ITN and clearly knew what CSE was seeking to procure.

33. Further, Mr. Esser stated that he was familiar with COTS and described it as a commercial off-the-shelf software package. Mr. Esser explained that an ERP solution, or Enterprise Resource Plan, is a package that is designed to do a series of tasks, such as produce standard reports and perform standard operations. He did not believe that he needed further training in COTS/ERP to evaluate the proposals. Mr. Doolittle was also familiar with COTS/ERP and believed, based on the amount of funding, that it was a likely response to the ITN.

34. Dr. Addy's doctoral dissertation research was in the area of software re-use. COTS is one of the components that comprise a development activity and re-use. He became aware during his research of how COTS packages are used in software engineering. He has also been exposed to ERP packages. ERP is only one form of a COTS package.

35. In regard to the development of the ITN and the expectations of the development team, Dr. Addy stated that they were amenable to any solution that met the requirements of the ITN. They fully expected that the compliance solutions were going to be comprised mostly of COTS and ERP packages. Furthermore, the ITN, in Section 1.1, on page 1-2, states: ". . . FDOR will consider an applicable Enterprise Resource Planning (ERP) or Commercial Off the Shelf (COTS) based solution in addition to custom development." Clearly, this ITN was an open procurement, and to train the evaluators on only one of the alternative solutions would have biased the evaluation process.

36. Mr. Doolittle gave each of the KPMG corporate references a score of 6. Mr. Strange and Mr. Focht questioned the appropriateness of these scores, as the corporate references themselves gave KPMG average ratings of 8.3, 8.2 and 8.0. However, Mr. Focht admitted that Mr. Doolittle's comments regarding the corporate references were a mixture of positive and negative comments. Mr. Focht believed, however, that because the reference corporations considered the same factors in providing ratings on the reference forms, it was inconsistent for Mr. Doolittle to separately evaluate the same factors that the corporations had already rated. However, there is no evidence in the record that KPMG provided Table 8.2 to the companies completing the reference forms or that the companies consulted the table when completing their reference forms. Therefore, KPMG did not prove that it had taken all measures available to it to improve its scores. Moreover, Mr. Focht's criticism would impose a requirement on Mr. Doolittle's evaluation which is not supported by the ITN. Mr. Focht admitted that there were no criteria in the ITN which limited the evaluators' discretion in scoring to the ratings given to the corporate references by those corporate reference customers.

37. All of the evaluators used Table 8.2 as their guide for scoring the corporate references. As part of his evaluation, Dr. Addy looked at the methodology used by the proposers in each of the corporate references to implement the solution for that reference company. He was looking at methodology to determine its degree of similarity to CAMS CE. While methodology is not specifically listed in Table 8.2 as a measure of similarity to CAMS, Table 8.2 states that the list is not all-inclusive. Clearly, methodology is a measure of similarity and therefore is not an arbitrary criterion. Moreover, as Dr. Addy used the same process and criteria in evaluating all of the proposals, there was no prejudice to KPMG by use of this criterion, since all vendors were subjected to it.

38. Mr. Strange stated that KPMG appeared to receive lower scores for SAP applications than other vendors. For example, evaluator 1 gave a score of 7 to Deloitte's reference for Suntax. Suntax is an SAP implementation. It is difficult to draw comparisons across vendors, yet the evaluators consistently found that KPMG's references lacked key elements such as data conversion, information on starting and ending costs, and information on database size. All of these missing elements contributed to a reduction in KPMG's scores. Nevertheless, KPMG received average scores of 5.5 for Duke, 5.7 for SSM and 6.3 for Armstrong, compared with the score of 7 received by Deloitte for Suntax. There is a gap of only 1.5 to 0.7 points between Deloitte's and KPMG's scores for SAP implementations, despite the deficient information within KPMG's corporate references.

Key Staff Criterion:

39. The proposals contain a summary of the experience of key staff and attached résumés. KPMG's proposed key staff person for Testing Lead was Frank Traglia. Mr. Traglia's summary showed that he had 25 years' experience in each of the areas of child support enforcement, information technology, project management and testing. Strange and Focht admitted that Mr. Traglia's résumé did not specifically list any testing experience. Mr. Focht further admitted that it was not unreasonable for evaluators to give the Testing Lead a lower score due to the lack of specific testing information in Mr. Traglia's résumé. Mr. Strange explained that the résumé came from a database of résumés. The summary sheet, however, was prepared by those KPMG employees who prepared the proposal. All of the evaluators resolved the conflicting information between the summary sheet and the résumé by crediting the résumé as more accurate. Each evaluator thought that the résumé was more specific and expected to see specific information regarding testing experience on the résumé of someone proposed as the Testing Lead person.

40. Evaluators Addy and Ellis gave the Testing Lead criterion scores of 4 and 5. Mr. Ron Vandenberg (evaluator 8) gave the Testing Lead a score of 9. Mr. Vandenberg was the only evaluator to give the Testing Lead a high score. The other evaluators gave the Testing Lead an average score of 4.2. The Vandenberg score thus appears anomalous.

41. All of the evaluators gave the Testing Lead a lower score, as his résumé did not specifically list testing experience. Dr. Addy found that the summary sheet listed 25 years of experience in each of child support enforcement, information technology, project management and system testing. As he did not believe this person had 100 years of experience, he assumed those experience categories ran concurrently. A strong candidate for Testing Lead should demonstrate a combination of testing experience, education and certification, according to Dr. Addy. Mr. Doolittle also expected to see testing experience mentioned in the résumé. When evaluating the Testing Lead, Mr. Bankirer first looked at the team skills matrix and found it interesting that testing was not one of the categories of skills listed for the Testing Lead. He then looked at the summary sheet and résumé from Mr. Traglia. He gave Mr. Traglia a lower score, as he thought that KPMG should have put forward someone with demonstrable testing experience.

42. The evaluators gave a composite score to key staff based on the criteria in Table 8.2. In order to derive the composite score that he gave each staff person, Mr. Esser created a scoring system wherein he awarded points for each attribute in Table 8.2 and then added the points together to arrive at a composite score. Among the criteria he rated, Mr. Esser awarded points for CSE experience. Mr. Focht and Mr. Strange contended that, since the term CSE experience is not actually listed in Table 8.2, Mr. Esser was incorrect in awarding points for CSE experience in his evaluation.

43. Table 8.2 does refer to relevant experience. There is no specific definition provided in Table 8.2 for relevant experience. Mr. Focht stated that relevant experience is limited to COTS/ERP experience, system development, life cycle and project management methodologies. However, these factors also are not listed in Table 8.2. Mr. Strange limited relevance to experience in the specific role for which the key staff person was proposed. This is a limitation that also is not imposed by Table 8.2. CSE experience is no more or less relevant than the factors posited by KPMG as relevant experience. Moreover, KPMG included a column for CSE experience in its own descriptive table of key staff. Inclusion of this information in its proposal demonstrates that KPMG itself must have believed CSE experience was relevant at the time it submitted its proposal.

44. Mr. Strange held the view that, in the bidders conference, in a reply to a vendor question, the Department representative stated that CSE experience was not required, and that Mr. Esser therefore could not use such experience to evaluate key staff. Question 47 of the Vendor Questions and Answers, Volume 2, stated:

     QUESTION: In scoring the Past Corporate Experience section, Child Support experience is not mentioned as a criterion. Would the State be willing to modify the criteria to include at least three Child Support implementations as a requirement?

     ANSWER: No. However, a child support implementation that also meets the other characteristics (contract value greater than $5 million, serves a large number of users, includes data conversion from a legacy system and includes training development) would be considered "similar to CAMS CE."

The Department's statement involved the scoring of corporate experience, not key staff. It was inapplicable to Mr. Esser's scoring system.

45. Mr. Esser gave the Training Lead a score of 1. According to Mr. Esser, the Training Lead did not have a ten-year résumé, for which he deducted one point. The Training Lead had no specialty certification or extensive experience and had no child support experience, and so received no points for those items. Mr. Esser added one point for the minimum of four years of specific experience and one point for the relevance of his education.

46. Mr. Esser gave the Project Manager a score of 5. The Project Manager had a ten-year résumé and the required references and received a point for each. Mr. Esser gave two points for exceeding the minimum required information technology experience. The Project Manager had twelve years of project management experience, for a score of one point, but lacked certification, a relevant education and child support enforcement experience, for which he was accorded no points.
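As an editorial check, Mr. Esser's additive tallies recited above are internally consistent: for the Project Manager, $1 + 1 + 2 + 1 = 5$ (résumé, references, information technology experience, project management experience), and for the Training Lead, $-1 + 1 + 1 = 1$ (the deduction for the missing ten-year résumé offset by the points for specific experience and relevant education), matching the scores of 5 and 1 that he assigned.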

47. Mr. Esser gave the Project Liaison person a score of 3. According to Mr. Focht, the Project Liaison should have received a higher score, since she had a professional history of having worked for the state technology office. Mr. Esser, however, stated that she did not have four years of specific experience and did not have extensive experience in the field, although she had a relevant education.

48. Mr. Esser gave the Software Lead person a score of 4. The Software Lead, according to Mr. Focht, had a long record of implementing SAP solutions for a wide variety of different clients and should have received a higher score. Mr. Esser gave a point each for having a ten-year résumé, four years of specific experience in software, extensive experience in this area and a relevant education.

49. According to Mr. Focht, the Database Lead had experience with database pools, including the Florida Retirement System, and should have received more points. Mr. Strange concurred with Mr. Focht in stating that Mr. Esser had given low scores to key staff, and stated that the staff had good experience, which should have generated more points. Mr. Strange believed that Mr. Esser's scoring was inconsistent but provided no basis for that conclusion.

50. Other evaluators also gave key staff positions scores of less than 7. Dr. Addy gave the Software Lead person a score of 5. The Software Lead had 16 years of experience and SAP development experience as positive factors, but had no development lead experience. He had a Bachelor of Science and a Master of Science in Mechanical Engineering and a Master's in Business Administration, which were not good matches in education for the role of a Software Lead person.

51. Dr. Addy gave the Training Lead person a score of 5. The Training Lead had six years of consulting experience, a background in SAP consulting and some training experience, but did not have certification or education in training. His educational background also was electrical engineering, which is not a strong background for a training person. Dr. Addy gave the subcontractor managers a score of 5. Two of the subcontractors did not list managers at all, which detracted from the score. Mr. Doolittle gave the Training Lead person a 5. He believed, based on his experience and training, that it was an average response.

52. Table 8.2 contained an item under which a proposer could have points deducted from a score if the key staff person's references were not excellent. The Department did not check references at this stage in the evaluation process. As a result, the evaluators simply did not consider that item when scoring. No proposer's score was adversely affected thereby.

53. KPMG contends that checking references would have given the evaluators greater insight into the work done by those individuals and their relevance and capabilities on the project team. Mr. Focht admitted, however, that any claimed effect on KPMG's score is conjectural. Mr. Strange stated that without reference checks the information in the proposals could not be validated, but he provided no basis for his opinion that reference checking was necessary at this preliminary stage of the evaluation process. Dr. Addy stated that the process called for checking references during the timeframe of oral presentations. The team did not expect the references to change any scores at this point in the process. KPMG asserted that references should be checked to ascertain the veracity of the information in the proposals. However, even if the information in some other proposal was inaccurate, it would not change the outcome for KPMG. KPMG would still not have the required number of points to advance to the next evaluation tier.

Divergency in Scores:

54. The Source Selection Plan established a process for resolving divergent scores. Any item receiving scores with a range of 5 or more was determined to be divergent. The plan provided that the Coordinator identify divergent scores and then report to the evaluators that there were divergent scores for that item. The Coordinator was precluded from telling an evaluator whether his score was the divergent one, i.e., the highest or lowest score. Evaluators would then review that item, but were not required to change their scores. The purpose of the divergent score process was to have evaluators review their scores to see if there were any misperceptions or errors that skewed the scores. The team wished to avoid having any influence on the evaluators' scores.

55. Mr. Strange testified that the Department did not follow the divergent score process in the Source Selection Plan, as the coordinator did not tell the evaluators why the scores were divergent. Mr. Strange stated that the evaluators should have been informed which scores were divergent. The Source Selection Plan, however, merely instructed the coordinator to inform the evaluators of the reason why the scores were divergent. Scores were inherently divergent if there was a five-point score spread; the reason for the divergence was self-explanatory.

56. The evaluators stated that they scored the proposals and submitted the scores, and that each received an e-mail from Debbie Stephens informing him that there were divergent scores and that they should consider re-scoring. None of the evaluators ultimately changed their scores. Mr. Esser's scores were the lowest of the divergent scores, but he did not re-score his proposals, as he had spent a great deal of time on the initial scoring and felt his scores to be valid. Neither witness Focht nor witness Strange for KPMG provided more than speculation regarding the effect of the divergent scores on KPMG's ultimate score and any role the divergent scoring process may have had in KPMG's not attaining the 150-point passage score.

Deloitte-Suntax Reference:

57. Susan Wilson, a Child Support Enforcement employee connected with the CAMS project, signed a reference for Deloitte Consulting regarding the Suntax System. Mr. Focht was concerned that the evaluators were influenced by her signature on the reference form. Mr. Strange further stated that having someone who is heavily involved in the project sign a reference did not appear to be fair. He was not able to state any positive or negative effect on KPMG of Wilson's reference for Deloitte, however.

58. Evaluator Esser has met Susan Wilson but has had no significant professional interaction with her. He could not recall anything that he knew about Ms. Wilson that would favorably influence him in scoring the Deloitte reference. Dr. Addy also was not influenced by Wilson. Mr. Doolittle has worked with Wilson for only a very short time and did not know her well. He has also evaluated other proposals where department employees were a reference and was not influenced by that either. Mr. Ellis has known Wilson for only two to four months. Her signature on the reference form did not influence him either positively or negatively. Mr. Bankirer had not known Wilson for a long time when he evaluated the Suntax reference. He took the reference at face value and was not influenced by Wilson's signature. It is not unusual for someone within an organization to create a reference for a company that is competing for work to be done for the organization.

CONCLUSIONS OF LAW

59. The Division of Administrative Hearings has jurisdiction of the subject matter of, and the parties to, this proceeding. Sections 120.569 and 120.57(1) and (3), Florida Statutes (2001). A bid protest proceeding is designed to:

     [D]etermine whether the agency's proposed action is contrary to the agency's governing statutes, the agency's rules or policies, or the bid or proposal specifications. The standard of proof for such proceeding shall be whether the proposed agency action was clearly erroneous, contrary to competition, arbitrary, or capricious.

Section 120.57(3)(f), Florida Statutes (2001).

60. While Section 120.57(3)(f), Florida Statutes, describes these proceedings as de novo, the courts have defined "de novo" for the purposes of a protest to a competitive procurement as a "form of inter-agency review. The Administrative Law Judge may receive evidence as with any formal hearing under Section 120.57(1), Florida Statutes, but the object of the proceeding is to evaluate the action taken by the agency." State Contracting & Engineering Corp. v. Dep't of Transportation, 709 So. 2d 607, 609 (Fla. 1st DCA 1998), citing Intercontinental Properties, Inc. v. State Dep't of Health and Rehabilitative Svcs., 606 So. 2d 308 (Fla. 3d DCA 1992).

61. The party initiating a competitive procurement protest bears the burden of proof. Section 120.57(3)(f), Florida Statutes (2001). Findings of Fact must be based upon a preponderance of the evidence. Section 120.57(1)(j), Florida Statutes (2001).

62. The standard of proof in a proceeding such as this concerns whether the proposed agency action was clearly erroneous, contrary to competition, arbitrary, or capricious. Section 120.57(3)(f), Florida Statutes.

63. A capricious action is one taken without thought or reason, or irrationally. An arbitrary decision is one not supported by facts or logic. Agrico Chemical Co. v. Dep't of Environmental Regulation, 365 So. 2d 759, 763 (Fla. 1st DCA 1978). A decision is clearly erroneous when it is unsupported by substantial evidence, contrary to the clear weight of the evidence, or induced by an erroneous view of the law. Black's Law Dictionary, Rev. 4th Ed. (1968).

64. An act is contrary to competition when it offends the purpose of competitive bidding. That purpose has been articulated as follows:

     [T]o protect the public against collusive contracts; to secure fair competition upon equal terms to all bidders; to remove not only collusion but temptation for collusion and opportunity for gain at public expense; to close all avenues to favoritism and fraud in its various forms; to secure the best values for the [public] at the lowest possible expense; and to afford an equal advantage to all desiring to do business with the [government], by affording an opportunity for an exact comparison of bids.

Wester v. Belote, 103 Fla. 976, 138 So. 721, 723-24 (Fla. 1931); Harry Pepper & Assoc., Inc. v. City of Cape Coral, 352 So. 2d 1190, 1192 (Fla. 2d DCA 1977).

65. The CAMS CE ITN has a two-tier evaluation process. After a finding of initial responsiveness, the evaluators scored the key proposal topics for the remaining proposals, including KPMG's. The key proposal topics were past corporate experience, for which the proposers submitted references from prior projects, and key staff, for which the proposers submitted staff résumés. To advance to the next evaluation tier, a proposer must score at least 150 of 230 points. KPMG scored 140 points, was eliminated from the next round of evaluation, and protested that elimination.
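For arithmetic context (an editorial note resting on the 0-to-10 scale and the 230-point maximum recited in the findings): if the weight factors scale the 0-to-10 topic scores to the 230-point maximum, the 150-point threshold corresponds to an average raw score of

$$\frac{150}{230} \times 10 \approx 6.5$$

on the 0-to-10 scale, which is why an average score of 7, the next whole integer, is treated in the findings as the score effectively needed to pass. KPMG's 140 points correspond to an average raw score of roughly 6.1.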

66. The Petitioner objected to the scoring system used in the evaluation as arbitrary and capricious. It had two objections to the scoring system: that the evaluators were given too much discretion and that there was no established baseline score. The ITN established a 0-10 scoring system, with 0 being poor and 10 being excellent. Within the parameters of that scale the evaluators were not given any further guidance regarding the meaning of a "5" versus a "7." Nor were the evaluators given a scoring methodology, such as to start from 0 and add points, or to start from 5 and add or subtract points. Hence, the evaluators each developed their own scoring methodologies. The Petitioner argued that the lack of a consistent scoring methodology made the scoring process arbitrary. However, each evaluator used his same scoring methodology in evaluating every proposal. Therefore, all the proposals were evaluated by the same criteria. The scoring methodology was not arbitrarily or capriciously applied against only the Petitioner, and the criteria used were explicitly provided for, or implicitly allowed, by the specifications of the ITN.

67. In the evaluation of complex procurements, the established and better practice is to employ a scoring system that permits the evaluators to use their own knowledge and experience to evaluate the proposals. The evaluators chosen by the Department had knowledge and skills in computer systems and knowledge of the functionalities necessary for a CSE Case Management System. As this is a complex procurement which generated complex and highly technical responses, it would have been difficult and counterproductive to set out every possible anticipated computer system in the ITN and devise a specific scoring system to cover all of those solutions. Such an approach is the direct opposite of the team's intent in developing the ITN: the development team sought an open-ended approach to solutions in order to generate innovative responses. It is impossible to cover the full range of acceptable responses in specific scoring standards without inadvertently leaving out relevant criteria. Therefore, the more cogent, rational approach is to trust the discretion of evaluators who have relevant skill and experience to determine the merits of highly complex proposals such as the CAMS CE system, rather than to impose highly detailed, inflexible standards on them.

68. Table 8.2 established the basic components that a complete proposal should contain. The Petitioner asserted that the evaluators should have received more training regarding appropriate scoring of the criteria. However, given the nature of the CAMS CE ITN, in which the computer system solution was left open to the proposer, and the complexity of the ITN and the proposals, it would have been difficult to provide more specific training without creating bias for one solution over another or inadvertently leaving out or precluding another solution.

69. KPMG did not protest the specifications of the ITN within the 72-hour period provided by Section 120.57(3)(f), Florida Statutes. Even though KPMG asserted that the instant case does not involve a protest of the specifications, clearly, if, as KPMG contends, the Department established scoring standards through the training of evaluators, those scoring standards would be de facto specifications. To the extent that KPMG is challenging what it asserts is a lack of sufficiently specific scoring standards or training of the evaluators in scoring methodology, KPMG is making an indirect attack on the specifications, or the purported lack thereof, and the time for such a challenge has passed and been waived.

70. KPMG asserted that the evaluators should have started at a baseline of 7 and scored proposals up or down from there. It asserted that an average of 7 is a minimally compliant score, basing that assertion on the passing score of 150 points out of 230 required for a vendor to continue in the evaluation process: in order to score 150 points, a proposer must score an average of 7 on the key proposal topics. KPMG's underlying assumption was that the Department was seeking to have all average proposals advance to the next evaluation level; however, it provided no evidence that the ITN, the Source Selection Plan, or the Source Selection Training Guide showed that the Department had this goal. To the contrary, as discussed above, those plans and guides, as well as the ITN, purposely avoided such scoring restrictions. The evidence demonstrated that CSE was seeking an optimal solution rather than an average solution; otherwise, the cut-off line for key proposal topics would have been the halfway mark of 115 points instead of 150 points.

71. KPMG asserted that it was unfairly evaluated due to the evaluators' ignorance of the solutions the ITN was designed to procure. Specifically, it posited a COTS/ERP solution, which is a commercial software package rather than a system built for CSE compliance enforcement. The ITN stated that either solution was acceptable. KPMG ignored the stated intent of the ITN to accept either solution and instead extrapolated an intent by the Department to seek a COTS/ERP solution from the references to such a solution in the ITN, although Mr. Focht admitted that many of those references also referred to custom development. Mr. Addy stated that the ITN development team expected to see solutions that were a mixture of COTS/ERP products and customization. Other than speculation, KPMG did not offer any preponderant evidence that the ITN sought a purely COTS/ERP solution.

72. KPMG based its premise that the evaluators did not understand a COTS/ERP solution on the evaluators' comments that the KPMG references failed to show any development. As there is no commercial product currently on the market that is a CSE Case Management System, some custom development will be required. Indeed, if such a system existed, there would be no need for the instant ITN, as the Department would simply purchase that product through a sole-source procurement. Mr. Strange stated that, in his opinion, the evaluators' use of the term "development" rather than the term "customization" demonstrated that they did not understand COTS/ERP. The Department, however, offered unrefuted evidence of the evaluators' knowledge of COTS/ERP products.

73. The backgrounds of the evaluators belied the assertion that they lacked understanding of the ITN or the proposed solutions. Dr. Addy has a doctorate in Information Technology; his dissertation topic was re-usable software, of which COTS/ERP is a subset. Dr. Addy wrote the ITN and understood the types of solutions that vendors were likely to posit in response to it. Mr. Doolittle, Mr. Ellis and Mr. Bankirer knew the functions they were seeking from a Case Management System, and all testified that they were familiar with COTS/ERP solutions. Mr. Esser, as the head of information technology at the Department of Highway Safety and Motor Vehicles, has participated in many information technology procurements and was also familiar with COTS/ERP.

74. KPMG asserts that the evaluators needed more training in COTS/ERP solutions. This position is not borne out by the evidence; indeed, KPMG's own allegations contradict it, since KPMG complained that other similar SAP applications, which were also COTS/ERP, received higher scores. KPMG did not assert that any customized systems received higher scores. The evaluators thus appeared to understand COTS/ERP; they simply believed that KPMG's references were deficient.

75. KPMG asserted that the evaluators' scores should have been aligned with the ratings given to KPMG by the corporate references. It stated that, as the references had already rated the project, the evaluators could not assign a different score to it. The Petitioner, however, offered no basis for this assertion. The evaluators noted the ratings given by the references but expected high ratings from corporate references chosen by the proposing vendor itself; as KPMG had chosen them, the evaluators presumed that KPMG would have chosen references that would give it high ratings. There is no requirement in the ITN that evaluators conform their evaluation of a reference to the ratings provided by that reference company.

76. KPMG asserted that the evaluators did not properly score its proposal in regard to the size of the project. Table 8.2 stated that any project of less than five million dollars would negatively affect the score of that proposer. KPMG asserted that the scoring for this factor was binary: since KPMG's references said the project was greater than five million dollars, KPMG should, in its view, have received the full allotment of points. It further asserted that cost is the only measure of size according to Table 8.2.

77. KPMG's assertions ignore the fact that the score was for the entire reference and not just for cost. Even if the evaluators gave full credit to KPMG for having a project that cost over five million dollars, the reference score was negatively affected by other factors, such as legacy database conversion, lack of information regarding the size of the database, and training.

78. Several evaluators commented that the Duke and SSM references were small-sized projects. KPMG stated that these comments were inaccurate, as cost is the only measure of size and the Duke and SSM references met the cost criterion. However, the evaluators were also looking at project size in relation to CAMS CE; therefore, they looked at database size, number of records, and number of sites. The evaluators found that some of that information was missing and that the information in the references reflected a project smaller and less complex than CAMS. These factors negatively affected KPMG's scores. KPMG also disputed the scoring of the Testing Lead person. All of the evaluators but one gave the Testing Lead a low score. The summary sheet for the Testing Lead listed 25 years of experience in testing, but the résumé listed no specific testing experience. Several years of specific testing experience would be expected to appear on the résumé of a person selected as Testing Lead for a large project such as CAMS CE. The evaluators gave the résumé more credence than the summary sheet: the résumé provided the specific description of the person's experience and was prepared by that person, whereas the experience summary sheet was prepared by the KPMG employees working on the project who prepared the proposal.

79. KPMG challenged Esser's scoring of key staff. It stated that Esser gave lower scores to key staff than the other evaluators did, and that Esser gave points to key staff for CSE experience. KPMG stated that CSE experience was not a criterion for evaluation according to Table 8.2 of the ITN. However, Table 8.2 stated that the listed criteria were not exhaustive. Table 8.2 asked the evaluators to consider relevant experience but did not specifically define relevant experience. KPMG's complaint appears to be another indirect challenge to the specifications, i.e., that the specifications in Table 8.2 did not provide specific instruction to evaluators regarding the definition of relevant experience. KPMG waived the right to such a challenge by not protesting the specifications within the appropriate 72-hour period after their issuance.

80. CAMS CE is a CSE project; logically, CSE experience is relevant. KPMG's proposal listed CSE experience in its team skills matrix. Clearly, the KPMG employees who prepared the proposal believed that it was relevant experience. When asked to define relevant experience as they understood it, KPMG's witnesses also listed criteria that were not in Table 8.2. It is apparent that Table 8.2 gave evaluators discretion regarding the scoring of relevant experience.

81. KPMG further claimed that, in its Vendor Questions and Answers, the Department stated that CSE experience was not required; it was therefore unfair, according to KPMG, to allow Esser to use this criterion. However, that specific Vendor Question and Answer referred to references for past corporate experience, not to key staff résumés, and did not apply to Esser's evaluation of key staff. Esser employed the same criterion when evaluating all proposals. The criterion was not arbitrarily or capriciously applied.

82. KPMG disputed Esser's scores because they were uniformly lower than those of the other evaluators. However, KPMG did not provide any evidence of bias against KPMG by Esser. Esser described the rationale he employed in determining each score; that rationale was uniformly applied and carried no inherent bias against KPMG or its proposal. KPMG has not met the burden of proof necessary to sustain overturning Esser's scores.

83. KPMG stated that the entire scoring process for key staff was flawed because Table 8.2 included a criterion of "references not excellent," which would negatively affect scores. The Department did not check references at this stage of the evaluation process. Because a less-than-excellent reference detracted from a score, KPMG would not have gained points from reference checks. KPMG witnesses speculated that if the Department had checked references, some information missing from the résumés could have been filled in and the key staff persons would have received higher scores; however, KPMG failed to offer any concrete examples. The evidence also demonstrated that references were not checked for any proposal at this stage of the evaluation process. There is no evidence that the Department's failure to check references at this stage was arbitrary or capricious or created any competitive disadvantage for KPMG.

84. KPMG challenged the Department's execution of the divergent-score process as not following the procedure laid out in the Source Selection Plan. The Plan stated that the coordinator must inform the evaluator of the reason for the score divergence. KPMG interpreted that requirement to mean that all evaluators must be informed of which scores were divergent. The team developing the ITN, however, wished to avoid disseminating information about which scores were divergent. The purpose of the divergent-scoring process was to trigger a review of the scoring of certain items in order to discover errors, not to influence evaluators to change their scores to conform to a norm.

85. Intervenor Deloitte submitted a reference for the Suntax project, which was signed by Susan Wilson, a CSE employee. KPMG contended that Wilson's signature on this reference created an unfair advantage for Deloitte. Other than speculation, however, it offered no evidence that any undue influence had been exerted by Wilson. To the contrary, the evaluators had little or no acquaintance with Wilson and were not affected by her involvement in the Suntax project.

RECOMMENDATION

Having considered the foregoing Findings of Fact, Conclusions of Law, the evidence of record and the pleadings and arguments of the parties, it is, therefore,

RECOMMENDED that a final order be entered by the State of Florida Department of Revenue upholding the proposed agency action which disqualified KPMG from further participation in the evaluation process regarding the subject CAMS CE Invitation to Negotiate.

DONE AND ENTERED this 26th day of September, 2002, in Tallahassee, Leon County, Florida.

___________________________________
P. MICHAEL RUFF
Administrative Law Judge
Division of Administrative Hearings
The DeSoto Building
1230 Apalachee Parkway
Tallahassee, Florida 32399-3060
(850) 488-9675   SUNCOM 278-9675
Fax Filing (850) 921-6847
www.doah.state.fl.us

Filed with the Clerk of the
Division of Administrative Hearings
this 26th day of September, 2002.

COPIES FURNISHED:

Cindy Horne, Esquire
Earl Black, Esquire
Department of Revenue
Post Office Box 6668
Tallahassee, Florida 32399-0100

Robert S. Cohen, Esquire
D. Andrew Byrne, Esquire
Cooper, Byrne, Blue & Schwartz, LLC
1358 Thomaswood Drive
Tallahassee, Florida 32308

Seann M. Frazier, Esquire
Greenburg, Traurig, P.A.
101 East College Avenue
Tallahassee, Florida 32302

Bruce Hoffmann, General Counsel
Department of Revenue
204 Carlton Building
Tallahassee, Florida 32399-0100

James Zingale, Executive Director
Department of Revenue
104 Carlton Building
Tallahassee, Florida 32399-0100

NOTICE OF RIGHT TO SUBMIT EXCEPTIONS

All parties have the right to submit written exceptions within 10 days from the date of this Recommended Order. Any exceptions to this Recommended Order should be filed with the agency that will issue the Final Order in this case.

Date         Proceedings
10/15/2002   Final Order filed.
10/11/2002   Agency Final Order
09/26/2002   Recommended Order
09/26/2002   Recommended Order issued (hearing held June 24 and 26, 2002) CASE CLOSED.
09/26/2002   Recommended Order cover letter identifying hearing record referred to the Agency sent out.
08/02/2002   (Proposed) Proposed Recommended Order (filed by Petitioner via facsimile).
08/02/2002   Deloitte Consulting, L.P.'s Proposed Recommended Order filed.
08/01/2002   Respondent's Proposed Recommended Order filed.
07/15/2002   Transcript filed.
06/26/2002   CASE STATUS: Hearing Held; see case file for applicable time frames.
06/20/2002   (Joint) Prehearing Stipulation (filed via facsimile).
06/13/2002   Petition to Intervene by Deloitte Consulting, Inc. (filed via facsimile).
05/22/2002   Notice of Taking Deposition Duces Tecum, M. Strange, J. Focht (2) filed.
05/10/2002   Order Granting Continuance and Re-scheduling Hearing issued (hearing set for June 24 and 26, 2002; 10:00 a.m.; Tallahassee, FL).
05/07/2002   Amended Notice of Proceedings filed by Respondent.
05/06/2002   Motion for Continuance (filed by Respondent via facsimile).
05/03/2002   Order of Pre-hearing Instructions issued.
05/03/2002   Notice of Hearing issued (hearing set for May 13, 2002; 10:00 a.m.; Tallahassee, FL).
05/01/2002   Notice of Proceedings filed.
05/01/2002   Formal Written Protest filed.
05/01/2002   Agency referral filed.

Case Information

Judge: P. MICHAEL RUFF
Date Filed: 05/01/2002
Date Assignment: 05/02/2002
Last Docket Entry: 10/15/2002
Location: Tallahassee, Florida
District: Northern
Agency: ADOPTED IN TOTO
Suffix: BID
