
THE Bar Examiner
Volume 84 | Number 1 | March 2015
A publication of the National Conference of Bar Examiners

Articles

8   2014 Statistics

44  The Revised ABA Standards for Approval of Law Schools: An Overview of the Major Changes
    by Jeffrey E. Lewis

Departments

2   Letter from the Chair
    by Bryan R. Williams

4   President’s Page
    by Erica Moeser

54  The Testing Column: Essay Grading Fundamentals
    by Judith A. Gundersen

57  News and Events

60  Litigation Update
    by Fred P. Parker III and Jessica Glad


THE Bar Examiner

Editor: Claire Huismann

Editorial Advisory Committee: Beverly Tarpley, Chair; Steven M. Barkan; Bedford T. Bentley, Jr.; Victoria A. Graffeo; Paul H. Mills; Madeleine J. Nibert; Fred P. Parker III; Hon. Phyllis D. Thompson

Publication Production and Design: Melanie Hoffman

Editorial Assistant: Lisa Palzkill

Publisher: National Conference of Bar Examiners

NCBE Officers
Chair: Bryan R. Williams
President: Erica Moeser
Immediate Past Chair: Margaret Fuller Corneille
Chair-Elect: Hon. Thomas J. Bice
Secretary: Robert A. Chong

NCBE Board of Trustees
Hulett H. Askew, Hon. Rebecca White Berch, Patrick R. Dixon, Michele A. Gavagni, Gordon J. MacDonald, Hon. Cynthia L. Martin, Suzanne K. Richards, Hon. Phyllis D. Thompson

Letter from the Chair

The mission of the National Conference of Bar Examiners is twofold. It is to work with other institutions to develop, maintain, and apply reasonable and uniform standards of education and character for eligibility for admission to the practice of law. It is also to assist bar admission authorities by, among other things, providing high-quality examinations for the testing of applicants for admission to the practice of law, conducting educational programs for bar admission authority members and staff, and providing other services such as character and fitness investigations and research.

In fulfilling its mission, NCBE sponsors a number of training and educational activities for the bar admissions community, provides assistance to jurisdictions on many fronts, and offers opportunities for involvement in its activities. I encourage bar examiners, administrators, judges, educators, and anyone affiliated with bar admissions to become familiar with the wide range of activities that NCBE provides and to actively participate in those activities.

For those involved in the drafting and grading of bar exam questions, NCBE sponsors a biennial mini-conference for bar examiners that addresses best practices in testing. Last fall, this two-day mini-conference, “Best Practices in Testing: A Mini-Conference for Bar Examiners,” held at NCBE’s headquarters in Madison, Wisconsin, focused on how bar examiners, especially new bar examiners, could better learn how to grade essays and how to stay calibrated in their grading over a period of time. For those who read and grade essays, this type of training is invaluable. The mini-conference also addressed the fundamental principles applicable to drafting high-quality questions, as well as desirable scoring methods to achieve the reliability required in high-stakes testing. Such best practices are also frequent topics at other NCBE educational events.

For those bar examiners from jurisdictions that use the Multistate Essay Examination (MEE) and/or the Multistate Performance Test (MPT), NCBE conducts grading workshops after the administration of each exam. These workshops, led by workshop facilitators familiar with the questions and grading materials and by NCBE testing staff, provide excellent analysis of the questions and expose attendees to the best practices for grading the questions to achieve the goal of spreading the scores. The workshops can be attended in person or via conference call; the sessions are also videotaped and edited and made available on demand for graders to stream at their convenience beginning the week after the exam.

In addition to providing training, NCBE devotes considerable resources to providing educational opportunities for the bar examining community and state courts, such as its Annual Bar Admissions Conference. It is always important to know the trends in bar admissions and the latest hot-button issues experienced by bar examiners throughout the country. This year’s Conference, held in Chicago from April 30 to May 3, will focus on, among other topics, a profile of the legal profession with particular emphasis on law school enrollment, law school debt, bar admission trends, and employment. Also sure to be of great interest are several sessions addressing specific character and fitness and ADA issues, as well as sessions discussing the admission of foreign-trained lawyers and exploring the future of tablet technology in testing. Other educational events sponsored by NCBE include biennial academic support conferences directed to law school faculty and administrators to help law schools maximize their students’ preparation for the bar exam, and mini-seminars educating select small audiences on a variety of topics.

Much of NCBE’s important work is done through its committees. One of my goals this year was to broaden the scope of committee membership by inviting a wider array of people involved in bar admissions from various jurisdictions. Standing committees on which people can serve include the following: Character and Fitness, Diversity Issues, Editorial Advisory, Education, Long Range Planning, Multistate Bar Examination, Multistate Essay Examination/Multistate Performance Test, Multistate Professional Responsibility Examination, Technology, and the Special Committee on the Uniform Bar Examination. Serving on a committee affords members the opportunity to learn about and participate in any of several aspects of bar admissions—whether exchanging ideas about the content of the questions that appear on NCBE exams and the methods for grading the questions, exploring technological advances in testing and grading, looking at issues that affect the diversity of the testing population, identifying and planning educational opportunities for the bar examining community, addressing issues relating to character and fitness, or participating in decisions concerning the content of this very magazine.

NCBE also assists jurisdictions with the more technical aspects of testing and grading—an invaluable service provided by NCBE’s staff of experienced testing professionals. Bar examiners in all jurisdictions face issues that go far beyond simply writing and grading the bar exam. Standard setting, reliability and validity of jurisdiction-drafted tests, and new and different testing methods are just a few of the issues faced by bar examiners for which NCBE provides guidance and expertise. For example, a number of years ago, the New York State Board of Law Examiners, on which I serve as a bar examiner, considered a controversial policy issue that had been visited by other jurisdictions. New York is the largest jurisdiction in terms of the numbers of candidates tested (over 15,000 in 2014), and much of the content of its exam is actually drafted by the Board. For these reasons, the issues central to the administration of the New York Bar Exam, and the policy concerns in New York, tend to be somewhat different from those in jurisdictions that test fewer applicants and that may rely on NCBE to draft their test questions. NCBE’s assistance in performing demographic studies, as well as lending its expertise in analyzing the data received, was invaluable to the New York Board in considering the controversial policy issue.

A number of state boards, in addition to their policy functions, are responsible for evaluating the character and fitness of applicants to the bar. NCBE offers investigation services to state bar admission authorities to verify information presented by applicants on their applications for admission to the bar—not only for U.S.-educated applicants, but also for foreign-educated applicants. Given the complexity of the verification process, this service saves jurisdictions time and resources.

I encourage members of the bar admissions community to become knowledgeable about and involved in NCBE activities. One of the best ways to do this is to attend NCBE’s Annual Bar Admissions Conference and to consider serving on one of the NCBE committees. I believe that committee work is one of the best ways to get to know the organization and that contributing time and talent to a committee, with the knowledge that it furthers NCBE’s mission and ultimately benefits the profession, can be very rewarding.

Best regards to all.

Sincerely,

Bryan R. Williams


President’s Page
by Erica Moeser

Several years ago I was elected to serve on the Board of Trustees of the small municipality in which I live. I found it striking that often as soon as new residents moved onto one of our Village streets, they began clamoring to have the street turned into a cul-de-sac, or at the very least to have speed bumps the size of small Alps installed to calm the traffic that moved by at the very same pace as when they had purchased their home.

I have recalled that phenomenon over the past few months when thinking of how some legal educators have reacted to the drop in MBE scores earned during the July 2014 test administration. The neighborhood street is lawyer licensing, essentially unchanged, and the cul-de-sac is the wish for the process to change. Some legal educators have expressed hope that NCBE would confess error in the equating of that particular July test. Others have gone further, calling for an overhaul of the test. A few educators (and I treasure them) have pointed out that the test results simply reflect the circumstances in which legal education finds itself these days. It is the very stability of the MBE that has revealed the beginning of what may be a continuing and troubling slide in bar passage in many states.

As I have written previously, my first reaction at seeing the results of the July 2014 MBE was one of concern, and I acted on that concern long before the test results were released. Having nailed down that the results of the MBE equating process were correct, we sent the scores out, and as one jurisdiction after another completed its grading, the impact of the decline in scores became apparent. Not content to rest on the pre-release replications of the results, I continued—and continue—to have the results studied and reproduced at the behest of the NCBE Board, which is itself composed of current and former bar examiners as well as bar admission administrators who are dedicated to doing things correctly.

Legal education commentators have raised questions about the reliability, validity, integrity, and fairness of the test and the processes by which it is created and scored. One dean has announced publicly that he smelled a rat. As perhaps the rat in question, I find it difficult to offer an effective rejoinder. Somehow “I am not a rat” does not get the job done.

The pages of this magazine are replete with explanations of how NCBE examinations are crafted and equated. The purpose of these articles has been to demystify the process and foster transparency. The material we publish, including the material appearing in this column, details the steps that are taken for quality control, and the qualifications of those who draft questions and otherwise contribute to the process. The words in the magazine—and much more—are available on the NCBE website for all those who are willing to take time to look.



The MBE is a good test, but we never rest on our laurels. We are constantly working to make it better. This is evidenced by the fact that the July 2014 MBE had the highest reliability ever of .92 (“reliability” being a term of art in measurement meant to communicate the degree to which an examinee’s score would be likely to remain the same if the examinee were to be tested again with a comparable but different set of questions). The MBE is also a valid test (another measurement term meant to communicate that the content of the test lines up with the purpose for which the test is administered—here the qualification of an entry-level practitioner for a general license to practice in any field of law).
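To put the measurement jargon in concrete terms, here is a minimal sketch of one classical way a reliability coefficient can be estimated: correlating examinees’ scores on two comparable (parallel) forms of a test. The scores are invented for illustration, and this toy calculation is not NCBE’s actual psychometric procedure for arriving at the .92 figure.

```python
# Illustrative sketch only: a parallel-forms reliability estimate computed as
# the Pearson correlation between scores on two comparable forms of a test.
# The scores are invented; this is not NCBE's actual procedure.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical scaled scores for six examinees on two parallel forms.
form_a = [142, 131, 155, 120, 148, 137]
form_b = [140, 133, 152, 118, 150, 135]

print(f"Parallel-forms reliability estimate: {pearson(form_a, form_b):.2f}")
```

A coefficient near 1.0 means the two forms rank examinees nearly identically; the closer it falls to 0, the more an examinee’s result would depend on which particular set of questions happened to be administered.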

As to the validity of the MBE, and of the palette of four tests that NCBE produces, the selection of test material must be relevant to the purpose to be served, here the testing of new entrants into a licensed profession. The collective judgments of legal educators, judges, and practicing lawyers all contribute to the construction of the test specifications that define possible test coverage. Ultimately a valid and reliable test that is carefully constructed and carefully scored by experts in the field is a fair test.

Those unfamiliar with the job analysis NCBE conducted about three years ago may find it useful to explore by visiting the Publications tab on the NCBE website. At the time of its release, I commended the report to both bar examiners and legal educators because the survey results spoke to the knowledge and skills new lawyers felt to be of value to them as they completed law school and struck out in various practice settings. The job analysis effort was consistent with our emphasis on testing what we believe the new lawyer should know (as opposed to the arcane or the highly complex). Of course, any bar examination lasting a mere two days does no more than sample what an examinee knows. We try to use testing time as efficiently as possible to achieve the broadest possible sampling.

With regard to integrity, either one has earned the trust that comes with years of performance or one hasn’t. Courts and bar examiners have developed trust in the MBE over the 40-plus years it has been administered. Some legal educators have not. Frankly, many legal educators paid little or no attention to the content and scoring of bar examinations until the current exigencies in legal education brought testing for licensure into sharp focus for them.

Of immediate interest to us at NCBE is the result of the introduction of Civil Procedure to the roster of MBE topics that occurred this February. Civil Procedure found its way onto the MBE by the same process of broad consultation that has marked all other changes to our tests as they have evolved. The drafting committee responsible for this test content is fairly characterized as blue-ribbon.

I recognize and appreciate the staggering challenges facing legal education today. I recently heard an estimate that projected Fall 2015 first-year enrollment at 35,000, down from 52,000 only a few years ago. This falloff comes as more law schools are appearing—204 law schools are currently accredited by the American Bar Association, with several more in the pipeline. Managing law school enrollment in this environment is an uphill battle. Couple that with regulatory requirements and the impact of law school rankings, and one wonders why anyone without a streak of masochism would become a law school dean these days. The demands placed on today’s law school deans are enormous.

NCBE welcomes the opportunity to increase communication with legal educators, both to reveal what we know and do, and to better understand the issues and pressures with which they are contending. We want to be as helpful as possible, given our respective roles in the process, in developing the strategies that will equip current and future learners for swift entry into the profession.

Over the past two years, jurisdictions have heeded the call to disclose name-specific information about who passes and who fails the bar examination to the law schools from which examinees graduate. At this writing all but a few states make this information available to law schools. NCBE has offered to transmit the information for jurisdictions that lack the personnel resources to do so. A chart reflecting the current state of disclosures appears below. It updates a chart that appeared in the December 2013 issue.

Summary: Pass/Fail Disclosure of Bar Exam Results (Updated chart from December 2013 Bar Examiner)

These jurisdictions automatically disclose name-specific pass/fail information to the law schools from which test-takers graduate: California, Connecticut, Georgia, Illinois, Iowa, Kansas, Maine, Maryland, Massachusetts, Missouri, Montana, Nebraska, New Hampshire, New Mexico, New York, Oklahoma, Oregon, Utah, Vermont, Virginia, Washington, Wisconsin.

These jurisdictions automatically disclose name-specific pass/fail information to in-state law schools; out-of-state law schools must request the information: Arkansas, Florida, Indiana, Kentucky, Louisiana, Minnesota, Mississippi, North Carolina, North Dakota, Ohio, Rhode Island, Tennessee, West Virginia, Wyoming.

These jurisdictions require all law schools to request name-specific pass/fail information: Alaska, Arizona, Colorado, Delaware, District of Columbia, Idaho, Michigan, Nevada, Pennsylvania, South Carolina, South Dakota, Virgin Islands.

These jurisdictions provide limited or no disclosure:
Disclosure with limitations: Hawaii,* New Jersey,† Texas‡
No disclosure: Alabama, Guam, Northern Mariana Islands, Puerto Rico, Republic of Palau

* Hawaii releases passing information only.
† If the applicant executes a waiver, New Jersey will release information to law schools on request.
‡ Texas will disclose name-specific pass/fail information on request of the law school unless the applicant has requested that the information not be released.

NCBE is currently disseminating name-specific pass/fail information on behalf of Georgia, Iowa, Maine, Missouri, Montana, New Mexico, Oregon, and Vermont.



Change in First-Year Enrollment from 2010 to 2013 and Reported Changes to the LSAT Score at the 25th Percentile
(Corrections to first-year enrollment data in the December 2014 Bar Examiner)

The December 2014 Bar Examiner included a chart showing the change in total first-year enrollment from 2010 to 2013 as provided by the ABA Section of Legal Education and Admissions to the Bar and the LSAT 25th percentile for each year. One law school informed us of a discrepancy in need of correction; corrected information appears in the chart below.

LAW SCHOOL: DAYTON, UNIVERSITY OF
Total First-Year Enrollment: 207 (2010), 177 (2011), 133 (2012), 100 (2013)
% Change, 2010 to 2013: -52%
25th-Percentile LSAT Score: 150 (2010), 148 (2011), 146 (2012), 145 (2013)

TOTALS (all schools): 52,106 (2010), 47,276 (2011), 43,155 (2012), 39,674 (2013); % Change, 2010 to 2013: -23.86%
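As a quick check of the percent-change column (illustrative only), each figure is the difference between 2013 and 2010 enrollment divided by 2010 enrollment:

```python
# Illustrative check of the percent-change arithmetic used in the chart above:
# (2013 enrollment - 2010 enrollment) / 2010 enrollment, as a percentage.
def percent_change(start: float, end: float) -> float:
    return (end - start) / start * 100

print(f"University of Dayton: {percent_change(207, 100):.0f}%")      # -52%
print(f"All schools combined: {percent_change(52106, 39674):.2f}%")  # -23.86%
```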

I am delighted to announce that Kansas has become the 15th state to adopt the Uniform Bar Examination. The Kansas Supreme Court acted this January on a recommendation of the Kansas Board of Law Examiners. The first UBE administration in Kansas will occur in February 2016, and Kansas will begin accepting UBE score transfers in April of this year.

In closing, I would like to reflect on the loss of Scott Street to the bar admissions community. Scott, who served for over 40 years as the Secretary-Treasurer of the Virginia Board of Bar Examiners, was both a founder of, and a mainstay in, the Committee of Bar Admission Administrators, now titled the Council of Bar Admission Administrators. His was an important voice as the job of admissions administrator became professionalized. Scott was selected as a member of the NCBE Board of Trustees and served the organization well in that capacity. He was a leader and a gentleman, and it saddens those of us who knew him to say good-bye.


2014 Statistics

This section includes data, by jurisdiction, on the following categories for 2014:

• the number of persons taking and passing bar examinations;

• the number taking and passing bar examinations categorized by source of legal education;

• the number of and passage rates for first-time exam takers and repeaters, both overall and for graduates of ABA-approved law schools;

• the number of and passage rates for graduates of non-ABA-approved law schools by type of school;

• the number of attorney candidates taking and passing special Attorneys’ Examinations; and

• the number of disbarred or suspended attorneys taking and passing examinations as a condition of reinstatement.

Also included are the following:

• a chart showing a longitudinal view of bar passage rates, both overall and for first-time takers, over a 10-year period;

• a five-year snapshot, by jurisdiction, of the number of persons admitted to the bar by examination, on motion, by transferred Uniform Bar Examination (UBE) score (data collection started by NCBE in 2013), and by diploma privilege, as well as the number of individuals licensed as foreign legal consultants; and

• a chart displaying relative admissions to the bar in 2014 by examination, on motion, and by diploma privilege.

Data for the first 10 charts were supplied by the jurisdictions. In reviewing the data, the reader should keep in mind that some individuals seek admission in more than one jurisdiction in a given year. The charts represent the data as of the date they were received from jurisdictions and may not reflect possible subsequent appeals or pending issues that might affect the overall passing statistics for a given jurisdiction. Statistics are updated to reflect any later changes received from jurisdictions and can be found on the NCBE website, www.ncbex.org.

The following national data are shown for the administrations of the Multistate Bar Examination (MBE) and the Multistate Professional Responsibility Examination (MPRE):

• summary statistics,
• score distributions,
• examinee counts over a 10-year period, and
• mean scaled scores over a 10-year period.

The use, by jurisdiction, is illustrated for the MBE, the MPRE, the Multistate Essay Examination (MEE), and the Multistate Performance Test (MPT).
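For readers who want to check the arithmetic behind the charts, the short sketch below (illustrative only) reproduces how each “% Passing” figure is derived: the passing count divided by the taking count, per administration and for the year overall. The sample counts are the Alabama row of the first chart.

```python
# Illustrative only: deriving the "% Passing" columns from Taking and Passing
# counts. Sample counts are the 2014 Alabama row of the first chart below.
counts = {"February": (230, 127), "July": (522, 337)}  # administration: (taking, passing)

total_taking = sum(taking for taking, _ in counts.values())
total_passing = sum(passing for _, passing in counts.values())

for administration, (taking, passing) in counts.items():
    print(f"{administration}: {passing}/{taking} = {passing / taking:.0%}")
print(f"Total: {total_passing}/{total_taking} = {total_passing / total_taking:.0%}")
```

Run against the published counts, this reproduces the rounded 55%, 65%, and 62% shown for Alabama.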


2014 Statistics Contents

Persons Taking and Passing the 2014 Bar Examination ....................................................................... 10

Persons Taking and Passing the 2014 Bar Examination by Source of Legal Education .................. 12

First-Time Exam Takers and Repeaters in 2014 ..................................................................................... 14

2014 First-Time Exam Takers and Repeaters from ABA-Approved Law Schools ........................... 18

2014 Exam Takers and Passers from Non-ABA-Approved Law Schools by Type of School ......... 22

Attorneys’ Examinations in 2014 .............................................................................................................. 23

Examinations Administered to Disbarred or Suspended Attorneys as a Condition of Reinstatement in 2014 ................................................................................................................................ 23

Ten-Year Summary of Bar Passage Rates, 2005–2014 ........................................................................... 24

Admissions to the Bar by Type, 2010–2014 ............................................................................................ 28

2014 Admissions to the Bar by Examination, on Motion, and by Diploma Privilege ..................... 31

Multistate Bar Examination ..................................................................................................................... 32

Jurisdictions Using the MBE in 2014 .................................................................................................. 33

2014 MBE National Summary Statistics (Based on Scaled Scores) ................................................ 34

2014 MBE National Score Distributions ............................................................................................. 34

MBE National Examinee Counts, 2005–2014 .................................................................................... 35

MBE National Mean Scaled Scores, 2005–2014 ................................................................................. 35

Multistate Professional Responsibility Examination .........................................................................36

Jurisdictions Using the MPRE in 2014 (with Pass/Fail Standards Indicated) ........................... 37

2014 MPRE National Summary Statistics (Based on Scaled Scores) ............................................ 38

2014 MPRE National Score Distributions ........................................................................................ 38

MPRE National Examinee Counts, 2005–2014 ................................................................................ 39

MPRE National Mean Scaled Scores, 2005–2014 ............................................................................ 39

Multistate Essay Examination ................................................................................................................. 40

Jurisdictions Using the MEE in 2014 ................................................................................................ 41

Multistate Performance Test .................................................................................................................... 42

Jurisdictions Using the MPT in 2014 ................................................................................................43


Persons Taking and Passing the 2014 Bar Examination

February July Total

Jurisdiction Taking Passing % Passing Taking Passing % Passing Taking Passing % Passing

Alabama 230 127 55% 522 337 65% 752 464 62%

Alaska 45 31 69% 74 48 65% 119 79 66%

Arizona 397 253 64% 667 456 68% 1,064 709 67%

Arkansas 139 88 63% 216 135 63% 355 223 63%

California 4,578 2,073 45% 8,504 4,135 49% 13,082 6,208 47%

Colorado 391 280 72% 847 631 74% 1,238 911 74%

Connecticut 278 199 72% 457 353 77% 735 552 75%

Delaware No February examination 192 121 63% 192 121 63%

District of Columbia 297 136 46% 264 87 33% 561 223 40%

Florida 1,315 820 62% 3,214 2,122 66% 4,529 2,942 65%

Georgia 574 364 63% 1,311 967 74% 1,885 1,331 71%

Hawaii 117 75 64% 169 116 69% 286 191 67%

Idaho 52 36 69% 113 76 67% 165 112 68%

Illinois 984 740 75% 2,398 1,940 81% 3,382 2,680 79%

Indiana 266 162 61% 552 400 72% 818 562 69%

Iowa 97 83 86% 253 206 81% 350 289 83%

Kansas 157 135 86% 188 148 79% 345 283 82%

Kentucky 198 152 77% 388 295 76% 586 447 76%

Louisiana 398 190 48% 762 532 70% 1,160 722 62%

Maine 61 41 67% 119 87 73% 180 128 71%

Maryland 567 342 60% 1,537 1,102 72% 2,104 1,444 69%

Massachusetts 679 414 61% 2,096 1,598 76% 2,775 2,012 73%

Michigan 681 444 65% 953 604 63% 1,634 1,048 64%

Minnesota 225 175 78% 747 593 79% 972 768 79%

Mississippi 111 90 81% 183 143 78% 294 233 79%

Missouri 262 211 81% 792 676 85% 1,054 887 84%

Montana 54 36 67% 126 81 64% 180 117 65%

Nebraska 42 18 43% 171 131 77% 213 149 70%

Nevada 224 128 57% 332 191 58% 556 319 57%

New Hampshire 61 46 75% 161 134 83% 222 180 81%

New Jersey 1,015 613 60% 3,297 2,445 74% 4,312 3,058 71%

New Mexico 137 111 81% 203 171 84% 340 282 83%

New York 4,032 1,902 47% 11,195 7,265 65% 15,227 9,167 60%


North Carolina 632 356 56% 1,207 746 62% 1,839 1,102 60%

North Dakota 42 26 62% 78 49 63% 120 75 63%

Ohio 440 283 64% 1,173 902 77% 1,613 1,185 73%

Oklahoma 121 85 70% 307 242 79% 428 327 76%

Oregon 213 140 66% 476 311 65% 689 451 65%

Pennsylvania 720 413 57% 1,981 1,496 76% 2,701 1,909 71%

Rhode Island 48 35 73% 176 128 73% 224 163 73%

South Carolina 252 158 63% 482 342 71% 734 500 68%

South Dakota 26 18 69% 84 61 73% 110 79 72%

Tennessee 304 194 64% 810 537 66% 1,114 731 66%

Texas 1,152 781 68% 2,929 2,091 71% 4,081 2,872 70%

Utah 147 113 77% 290 236 81% 437 349 80%

Vermont 47 32 68% 61 40 66% 108 72 67%

Virginia 547 325 59% 1,377 936 68% 1,924 1,261 66%

Washington 334 237 71% 886 685 77% 1,220 922 76%

West Virginia 81 57 70% 186 137 74% 267 194 73%

Wisconsin 95 68 72% 175 131 75% 270 199 74%

Wyoming 23 15 65% 60 45 75% 83 60 72%

Guam 9 7 78% 13 8 62% 22 15 68%

N. Mariana Islands 3 2 67% 5 5 100% 8 7 88%

Palau No February examination 17 3 18% 17 3 18%

Puerto Ricoa 523 178 34% 698 296 42% 1,221 474 39%

Virgin Islands 11 6 55% 19 16 84% 30 22 73%

TOTALS 24,434 14,044 57% 56,493 37,769 67% 80,927 51,813 64%

a Examinations in Puerto Rico are administered in March and September.


Persons Taking and Passing the 2014 Bar Examination by Source of Legal Education

ABA-Approved Law School

Non-ABA-Approved Law Schoola

Law School Outside the USA Law Office Study

Jurisdiction Taking Passing % Passing Taking Passing % Passing Taking Passing % Passing Taking Passing % Passing

Alabama 469 401 86% 278 59 21% 5 4 80% — — —

Alaska 115 78 68% 2 1 50% 2 0 0% — — —

Arizona 1,057 705 67% 4 3 75% 3 1 33% — — —

Arkansas 355 223 63% — — — — — — — — —

California 8,786b,c 5,010b,c 57% 2,124b,c 419b,c 20% 1,031 148 14% 10 3 30%

Colorado 1,231 908 74% 4 1 25% 3 2 67% — — —

Connecticut 696 550 79% 39 2 5% — — — — — —

Delaware 192 121 63% — — — — — — — — —

District of Columbia 303 144 48% 14 1 7% 244 78 32% — — —

Florida 4,524 2,941 65% 5 1 20% — — — — — —

Georgia 1,858 1,327 71% 25 2 8% 2 2 100% — — —

Hawaii 286 191 67% — — — — — — — — —

Idaho 165 112 68% — — — — — — — — —

Illinois 3,318 2,656 80% 1 1 100% 63 23 37% — — —

Indiana 818 562 69% — — — — — — — — —

Iowa 348 289 83% — — — 2 0 0% — — —

Kansas 345 283 82% — — — — — — — — —

Kentucky 586 447 76% — — — — — — — — —

Louisiana 1,143 718 63% — — — 17 4 24% — — —

Maine 173 124 72% 6 4 67% 1 0 0% — — —

Maryland 2,086 1,436 69% 2 2 100% 16 6 38% — — —

Massachusetts 2,443 1,902 78% 291 95 33% 41 15 37% — — —

Michigan 1,630 1,048 64% — — — 4 0 0% — — —

Minnesota 972 768 79% — — — — — — — — —

Mississippi 294 233 79% — — — — — — — — —

Missouri 1,045 883 84% 2 2 100% 7 2 29% — — —

a See page 22 for a breakdown of exam takers and passers from non-ABA-approved law schools by type of school.
b California does not recognize U.S. attorneys taking the General Bar Examination as being from either ABA-approved or non-ABA-approved law schools. This number of applicants (1,078 taking, 624 passing) is therefore omitted from either category. California’s “U.S. Attorneys Taking the General Bar Exam” category is composed of attorneys admitted in other jurisdictions less than four years who must take, and those admitted four or more years who have elected to take, the General Bar Examination.
c Applicants under California’s four-year qualification rule who did not earn J.D. degrees (53 taking, 4 passing) are not included in either the ABA-approved or non-ABA-approved category. California’s four-year qualification rule allows applicants to take the General Bar Examination through a combination of four years of law study without graduating from a law school.


Montana 180 117 65% — — — — — — — — —

Nebraska 213 149 70% — — — — — — — — —

Nevada 549 316 58% 3 1 33% 4 2 50% — — —

New Hampshire 204 168 82% 18 12 67% — — — — — —

New Jersey 4,312 3,058 71% — — — — — — — — —

New Mexico 338 282 83% 2 0 0% — — — — — —

New York 10,392 7,596 73% 6 1 17% 4,813 1,565 33% 16 5 31%

North Carolina 1,839 1,102 60% — — — — — — — — —

North Dakota 120 75 63% — — — — — — — — —

Ohio 1,593 1,181 74% — — — 20 4 20% — — —

Oklahoma 428 327 76% — — — — — — — — —

Oregon 682 450 66% 1 1 100% 6 0 0% — — —

Pennsylvania 2,697 1,909 71% 1 0 0% 3 0 0% — — —

Rhode Island 224 163 73% — — — — — — — — —

South Carolina 734 500 68% — — — — — — — — —

South Dakota 110 79 72% — — — — — — — — —

Tennessee 841 593 71% 265 138 52% 8 0 0% — — —

Texas 4,037 2,860 71% 14 6 43% 30 6 20% — — —

Utah 437 349 80% — — — — — — — — —

Vermont 99 67 68% — — — 1 1 100% 8 4 50%

Virginia 1,903 1,259 66% — — — 10 0 0% 11 2 18%

Washington 1,187 907 76% — — — 17 6 35% 16 9 56%

West Virginia 267 194 73% — — — — — — — — —

Wisconsin 260 197 76% 1 1 100% 9 1 11% — — —

Wyoming 83 60 72% — — — — — — — — —

Guam 22 15 68% — — — — — — — — —

N. Mariana Islands 8 7 88% — — — — — — — — —

Palau 6 1 17% 1 0 0% 10 2 20% — — —

Puerto Rico 1,192 466 39% 29 8 28% — — — — — —

Virgin Islands 30 22 73% — — — — — — — — —

TOTALS 70,225 48,529 69% 3,138 761 24% 6,372 1,872 29% 61 23 38%


First-Time Exam Takers and Repeaters in 2014a

First-Timers Repeaters
Jurisdiction 2014 Administration Taking Passing % Passing Taking Passing % Passing

Alabama February 128 99 77% 102 28 27%

July 418 331 79% 104 6 6%

Total 546 430 79% 206 34 17%

Alaska February 33 26 79% 12 5 42%

July 61 47 77% 13 1 8%

Total 94 73 78% 25 6 24%

Arizona February 280 199 71% 117 54 46%

July 564 421 75% 103 35 34%

Total 844 620 73% 220 89 40%

Arkansas February 81 66 81% 58 22 38%

July 177 129 73% 39 6 15%

Total 258 195 76% 97 28 29%

California February 1,492 822 55% 3,086 1,251 41%

July 6,220 3,818 61% 2,284 317 14%

Total 7,712 4,640 60% 5,370 1,568 29%

Colorado February 281 220 78% 110 60 55%

July 787 616 78% 60 15 25%

Total 1,068 836 78% 170 75 44%

Connecticut February 192 167 87% 86 32 37%

July 408 346 85% 49 7 14%

Total 600 513 86% 135 39 29%

Delaware February No February examination

July 156 107 69% 36 14 39%

Total 156 107 69% 36 14 39%

Dist. of Columbia February 179 110 61% 118 26 22%

July 140 73 52% 124 14 11%

Total 319 183 57% 242 40 17%

Florida February 805 587 73% 510 233 46%

July 2,864 2,057 72% 350 65 19%

Total 3,669 2,644 72% 860 298 35%

Georgia February 339 272 80% 235 92 39%

July 1,133 909 80% 178 58 33%

Total 1,472 1,181 80% 413 150 36%

Hawaii February 83 60 72% 34 15 44%

July 145 109 75% 24 7 29%

Total 228 169 74% 58 22 38%

Idaho February 41 29 71% 11 7 64%

July 101 75 74% 12 1 8%

Total 142 104 73% 23 8 35%

Illinois February 661 552 84% 323 188 58%

July 2,203 1,881 85% 195 59 30%

Total 2,864 2,433 85% 518 247 48%

Indiana February 152 119 78% 114 43 38%

July 474 378 80% 78 22 28%

Total 626 497 79% 192 65 34%


Iowa February 81 73 90% 16 10 63%

July 245 202 82% 8 4 50%

Total 326 275 84% 24 14 58%

Kansas February 132 122 92% 25 13 52%

July 176 144 82% 12 4 33%

Total 308 266 86% 37 17 46%

Kentucky February 122 98 80% 76 54 71%

July 355 286 81% 33 9 27%

Total 477 384 81% 109 63 58%

Louisiana February 150 71 47% 248 119 48%

July 572 429 75% 190 103 54%

Total 722 500 69% 438 222 51%

Maine February 33 26 79% 28 15 54%

July 107 81 76% 12 6 50%

Total 140 107 76% 40 21 53%

Maryland February 267 190 71% 300 152 51%

July 1,359 1,049 77% 178 53 30%

Total 1,626 1,239 76% 478 205 43%

Massachusetts February 388 283 73% 291 131 45%

July 1,877 1,545 82% 219 53 24%

Total 2,265 1,828 81% 510 184 36%

Michigan February 382 271 71% 299 173 58%

July 769 563 73% 184 41 22%

Total 1,151 834 72% 483 214 44%

Minnesota February 149 132 89% 76 43 57%

July 703 581 83% 44 12 27%

Total 852 713 84% 120 55 46%

Mississippi February 77 69 90% 34 21 62%

July 156 133 85% 27 10 37%

Total 233 202 87% 61 31 51%

Missouri February 205 175 85% 57 36 63%

July 753 662 88% 39 14 36%

Total 958 837 87% 96 50 52%

Montana February 41 31 76% 13 5 38%

July 114 77 68% 12 4 33%

Total 155 108 70% 25 9 36%

Nebraska February 19 11 58% 23 7 30%

July 157 124 79% 14 7 50%

Total 176 135 77% 37 14 38%

Nevada February 143 96 67% 81 32 40%

July 261 179 69% 71 12 17%

Total 404 275 68% 152 44 29%


New Hampshire February 46 39 85% 15 7 47%

July 151 130 86% 10 4 40%

Total 197 169 86% 25 11 44%

New Jersey February 591 393 66% 424 220 52%

July 3,041 2,360 78% 256 85 33%

Total 3,632 2,753 76% 680 305 45%

New Mexico February 116 102 88% 21 9 43%

July 180 158 88% 23 13 57%

Total 296 260 88% 44 22 50%

New York February 1,490 918 62% 2,542 984 39%

July 9,231 6,872 74% 1,964 393 20%

Total 10,721 7,790 73% 4,506 1,377 31%

North Carolina February 267 171 64% 365 185 51%

July 821 698 85% 386 48 12%

Total 1,088 869 80% 751 233 31%

North Dakota February 31 21 68% 11 5 45%

July 66 42 64% 12 7 58%

Total 97 63 65% 23 12 52%

Ohio February 247 194 79% 193 89 46%

July 1,055 858 81% 118 44 37%

Total 1,302 1,052 81% 311 133 43%

Oklahoma February 66 56 85% 55 29 53%

July 285 239 84% 22 3 14%

Total 351 295 84% 77 32 42%

Oregon February 134 107 80% 79 33 42%

July 419 298 71% 57 13 23%

Total 553 405 73% 136 46 34%

Pennsylvania February 344 249 72% 376 164 44%

July 1,747 1,440 82% 234 56 24%

Total 2,091 1,689 81% 610 220 36%

Rhode Island February 25 19 76% 23 16 70%

July 164 127 77% 12 1 8%

Total 189 146 77% 35 17 49%

South Carolina February 170 120 71% 82 38 46%

July 413 308 75% 69 34 49%

Total 583 428 73% 151 72 48%

South Dakota February 17 13 76% 9 5 56%

July 80 60 75% 4 1 25%

Total 97 73 75% 13 6 46%

Tennessee February 185 134 72% 119 60 50%

July 712 514 72% 98 23 23%

Total 897 648 72% 217 83 38%


Texas February 742 570 77% 410 211 51%

July 2,548 1,965 77% 381 126 33%

Total 3,290 2,535 77% 791 337 43%

Utah February 111 95 86% 36 18 50%

July 261 228 87% 29 8 28%

Total 372 323 87% 65 26 40%

Vermont February 32 27 84% 15 5 33%

July 47 32 68% 14 8 57%

Total 79 59 75% 29 13 45%

Virginia February 263 184 70% 284 141 50%

July 1,216 886 73% 161 50 31%

Total 1,479 1,070 72% 445 191 43%

Washington February 215 170 79% 119 67 56%

July 815 653 80% 71 32 45%

Total 1,030 823 80% 190 99 52%

West Virginia February 43 39 91% 38 18 47%

July 166 132 80% 20 5 25%

Total 209 171 82% 58 23 40%

Wisconsin February 78 61 78% 17 7 41%

July 155 127 82% 20 4 20%

Total 233 188 81% 37 11 30%

Wyoming February 15 11 73% 8 4 50%

July 54 43 80% 6 2 33%

Total 69 54 78% 14 6 43%

Guam February 4 3 75% 5 4 80%

July 9 7 78% 4 1 25%

Total 13 10 77% 9 5 56%

N. Mariana Islands February 3 2 67% — — —

July 5 5 100% — — —

Total 8 7 88% — — —

Palau February No February examination

July 13 2 15% 4 1 25%

Total 13 2 15% 4 1 25%

Puerto Ricob February 152 56 37% 371 122 33%

July 451 213 47% 247 83 34%

Total 603 269 45% 618 205 33%

Virgin Islands February 7 4 57% 4 2 50%

July 15 13 87% 4 3 75%

Total 22 17 77% 8 5 63%

TOTALS February 12,330 8,734 71% 12,104 5,310 44%

July 47,575 35,762 75% 8,918 2,007 23%

Total 59,905 44,496 74% 21,022 7,317 35%

a First-time exam takers are defined as examinees taking the bar examination for the first time in the reporting jurisdiction. Repeaters are defined as examinees who have taken the bar examination in the reporting jurisdiction at least once prior to the listed administration.
b Examinations in Puerto Rico are administered in March and September.


2014 First-Time Exam Takers and Repeaters from ABA-Approved Law Schoolsa

ABA First-Timers ABA Repeaters

Jurisdiction 2014 Administration Taking Passing % Passing Taking Passing % Passing

Alabama February 85 78 92% 25 14 56%

July 348 305 88% 11 4 36%

Total 433 383 88% 36 18 50%

Alaska February 33 26 79% 11 5 45%

July 59 46 78% 12 1 8%

Total 92 72 78% 23 6 26%

Arizona February 276 198 72% 116 53 46%

July 563 420 75% 102 34 33%

Total 839 618 74% 218 87 40%

Arkansas February 81 66 81% 58 22 38%

July 177 129 73% 39 6 15%

Total 258 195 76% 97 28 29%

California February 736 441 60% 1,849 935 51%

July 5,102 3,415 67% 1,099 219 20%

Total 5,838 3,856 66% 2,948 1,154 39%

Colorado February 280 219 78% 109 60 55%

July 784 615 78% 58 14 24%

Total 1,064 834 78% 167 74 44%

Connecticut February 183 165 90% 74 32 43%

July 400 346 87% 39 7 18%

Total 583 511 88% 113 39 35%

Delaware February No February examination

July 156 107 69% 36 14 39%

Total 156 107 69% 36 14 39%

Dist. of Columbia February 107 74 69% 45 8 18%

July 94 60 64% 57 2 4%

Total 201 134 67% 102 10 10%

Florida February 804 586 73% 509 233 46%

July 2,862 2,057 72% 349 65 19%

Total 3,666 2,643 72% 858 298 35%

Georgia February 339 272 80% 219 90 41%

July 1,133 909 80% 167 56 34%

Total 1,472 1,181 80% 386 146 38%

Hawaii February 83 60 72% 34 15 44%

July 145 109 75% 24 7 29%

Total 228 169 74% 58 22 38%

Idaho February 41 29 71% 11 7 64%

July 101 75 74% 12 1 8%

Total 142 104 73% 23 8 35%

Illinois February 644 544 84% 313 187 60%

July 2,179 1,867 86% 182 58 32%

Total 2,823 2,411 85% 495 245 49%

Indiana February 152 119 78% 114 43 38%

July 474 378 80% 78 22 28%

Total 626 497 79% 192 65 34%


Iowa February 81 73 90% 15 10 67%

July 245 202 82% 7 4 57%

Total 326 275 84% 22 14 64%

Kansas February 132 122 92% 25 13 52%

July 176 144 82% 12 4 33%

Total 308 266 86% 37 17 46%

Kentucky February 122 98 80% 76 54 71%

July 355 286 81% 33 9 27%

Total 477 384 81% 109 63 58%

Louisiana February 145 69 48% 244 118 48%

July 570 429 75% 184 102 55%

Total 715 498 70% 428 220 51%

Maine February 30 24 80% 27 14 52%

July 105 81 77% 11 5 45%

Total 135 105 78% 38 19 50%

Maryland February 264 190 72% 295 149 51%

July 1,351 1,045 77% 176 52 30%

Total 1,615 1,235 76% 471 201 43%

Massachusetts February 323 256 79% 197 103 52%

July 1,796 1,506 84% 127 37 29%

Total 2,119 1,762 83% 324 140 43%

Michigan February 443 271 61% 235 173 74%

July 769 563 73% 183 41 22%

Total 1,212 834 69% 418 214 51%

Minnesota February 149 132 89% 76 43 57%

July 703 581 83% 44 12 27%

Total 852 713 84% 120 55 46%

Mississippi February 77 69 90% 34 21 62%

July 156 133 85% 27 10 37%

Total 233 202 87% 61 31 51%

Missouri February 202 172 85% 56 36 64%

July 749 661 88% 38 14 37%

Total 951 833 88% 94 50 53%

Montana February 41 31 76% 13 5 38%

July 114 77 68% 12 4 33%

Total 155 108 70% 25 9 36%

Nebraska February 19 11 58% 23 7 30%

July 157 124 79% 14 7 50%

Total 176 135 77% 37 14 38%

Nevada February 143 96 67% 77 31 40%

July 259 177 68% 70 12 17%

Total 402 273 68% 147 43 29%


New Hampshire February 42 32 76% 9 7 78%

July 145 126 87% 8 3 38%

Total 187 158 84% 17 10 59%

New Jersey February 591 393 66% 424 220 52%

July 3,041 2,360 78% 256 85 33%

Total 3,632 2,753 76% 680 305 45%

New Mexico February 116 102 88% 20 9 45%

July 180 158 88% 22 13 59%

Total 296 260 88% 42 22 52%

New York February 1,284 718 56% 975 641 66%

July 7,302 6,031 83% 831 206 25%

Total 8,586 6,749 79% 1,806 847 47%

North Carolina February 267 171 64% 365 185 51%

July 821 698 85% 386 48 12%

Total 1,088 869 80% 751 233 31%

North Dakota February 31 21 68% 11 5 45%

July 66 42 64% 12 7 58%

Total 97 63 65% 23 12 52%

Ohio February 243 192 79% 189 89 47%

July 1,050 857 82% 111 43 39%

Total 1,293 1,049 81% 300 132 44%

Oklahoma February 66 56 85% 55 29 53%

July 285 239 84% 22 3 14%

Total 351 295 84% 77 32 42%

Oregon February 133 107 80% 76 32 42%

July 417 298 71% 56 13 23%

Total 550 405 74% 132 45 34%

Pennsylvania February 344 249 72% 375 164 44%

July 1,745 1,440 83% 233 56 24%

Total 2,089 1,689 81% 608 220 36%

Rhode Island February 25 19 76% 23 16 70%

July 164 127 77% 12 1 8%

Total 189 146 77% 35 17 49%

South Carolina February 170 120 71% 82 38 46%

July 413 308 75% 69 34 49%

Total 583 428 73% 151 72 48%

South Dakota February 17 13 76% 9 5 56%

July 80 60 75% 4 1 25%

Total 97 73 75% 13 6 46%

Tennessee February 119 88 74% 54 31 57%

July 615 459 75% 53 15 28%

Total 734 547 75% 107 46 43%


Texas February 726 565 78% 406 210 52%

July 2,534 1,959 77% 371 126 34%

Total 3,260 2,524 77% 777 336 43%

Utah February 111 95 86% 36 18 50%

July 261 228 87% 29 8 28%

Total 372 323 87% 65 26 40%

Vermont February 28 24 86% 10 3 30%

July 54 33 61% 7 7 100%

Total 82 57 70% 17 10 59%

Virginia February 260 182 70% 275 141 51%

July 1,215 886 73% 153 50 33%

Total 1,475 1,068 72% 428 191 45%

Washington February 209 165 79% 116 66 57%

July 793 644 81% 69 32 46%

Total 1,002 809 81% 185 98 53%

West Virginia February 43 39 91% 38 18 47%

July 166 132 80% 20 5 25%

Total 209 171 82% 58 23 40%

Wisconsin February 76 61 80% 14 7 50%

July 153 125 82% 17 4 24%

Total 229 186 81% 31 11 35%

Wyoming February 15 11 73% 8 4 50%

July 54 43 80% 6 2 33%

Total 69 54 78% 14 6 43%

Guam February 4 3 75% 5 4 80%

July 9 7 78% 4 1 25%

Total 13 10 77% 9 5 56%

N. Mariana Islands February 3 2 67% — — —

July 5 5 100% — — —

Total 8 7 88% — — —

Palau February No February examination

July 4 0 0% 2 1 50%

Total 4 0 0% 2 1 50%

Puerto Ricob February 152 56 37% 353 116 33%

July 451 213 47% 236 81 34%

Total 603 269 45% 589 197 33%

Virgin Islands February 7 4 57% 4 2 50%

July 15 13 87% 4 3 75%

Total 22 17 77% 8 5 63%

TOTALS February 11,097 7,979 72% 8,812 4,541 52%

July 44,120 34,338 78% 6,196 1,671 27%

Total 55,217 42,317 77% 15,008 6,212 41%

a First-time exam takers are defined as examinees taking the bar examination for the first time in the reporting jurisdiction. Repeaters are defined as examinees who have taken the bar examination in the reporting jurisdiction at least once prior to the listed administration.
b Examinations in Puerto Rico are administered in March and September.


2014 Exam Takers and Passers from Non-ABA-Approved Law Schools by Type of School

Conventional Law Schoola Correspondence Law Schoolb Online Law Schoolc

Jurisdiction Taking Passing % Passing Taking Passing % Passing Taking Passing % Passing

Alabama 278 59 21% — — — — — —

Alaska 2 1 50% — — — — — —

Arizona 4 3 75% — — — — — —

Californiad 1,540 335 22% 158 33 21% 317 48 15%

Colorado 4 1 25% — — — — — —

Connecticut 39 2 5% — — — — — —

District of Columbia 8 1 13% — — — 6 0 0%

Florida 5 1 20% — — — — — —

Georgia 25 2 8% — — — — — —

Illinois 1 1 100% — — — — — —

Maine 6 4 67% — — — — — —

Maryland — — — — — — 2 2 100%

Massachusetts 291 95 33% — — — — — —

Missouri 2 2 100% — — — — — —

Nevada 3 1 33% — — — — — —

New Hampshire 18 12 67% — — — — — —

New Mexico 2 0 0% — — — — — —

New York 6 1 17% — — — — — —

Oregon 1 1 100% — — — — — —

Pennsylvania 1 0 0% — — — — — —

Tennessee 265 138 52% — — — — — —

Texas 14 6 43% — — — — — —

Wisconsin — — — — — — 1 1 100%

Palau 1 0 0% — — — — — —

Puerto Rico 29 8 28% — — — — — —

TOTALS 2,545 674 26% 158 33 21% 326 51 16%

a Conventional law schools are fixed-facility schools that conduct instruction principally in physical classroom facilities.
b Correspondence law schools are schools that conduct instruction principally by correspondence.
c Online law schools are schools that conduct instruction and provide interactive classes principally by technological transmission, including Internet transmission and electronic conferencing.
d California applicants from non-ABA-approved law schools also include those who attended schools no longer in operation, composed of an unverifiable mixture of conventional, correspondence, and online schools. This number of applicants (109 taking, 3 passing) is therefore omitted from this chart.


Attorneys’ Examinationsa in 2014

February July Total

Jurisdiction Taking Passing % Passing Taking Passing % Passing Taking Passing % Passing

California 510 275 54% 417 131 31% 927 406 44%

Georgia 136 125 92% 114 89 78% 250 214 86%

Idaho 12 11 92% 13 6 46% 25 17 68%

Maine 22 20 91% 14 10 71% 36 30 83%

Maryland 87 67 77% 104 99 95% 191 166 87%

Rhode Island 22 19 86% 12 7 58% 34 26 76%

Vermont — — — 61 40 66% 61 40 66%

Guam — — — 2 0 0% 2 0 0%

N. Mariana Islands 2 1 50% 1 1 100% 3 2 67%

Virgin Islands 4 2 50% — — — 4 2 50%

TOTALS 795 520 65% 738 383 52% 1,533 903 59%

a Attorneys’ Examination refers to a short form or other form of bar examination administered to attorneys admitted in other jurisdictions.

Examinations Administered to Disbarred or Suspended Attorneys as a Condition of Reinstatement in 2014a

Jurisdiction Taking Passing % Passing

Arizona 3 1 33%

Arkansas 1 1 100%

California 33 2 6%

Colorado 2 2 100%

Floridab 8 2 25%

South Carolina 5 3 60%

Texas 6 0 0%

TOTALS 58 11 19%

a The form of examination administered to disbarred or suspended attorneys varied among jurisdictions as follows: regular bar examination (5 jurisdictions), local component only (1 jurisdiction), Attorneys’ Examination (1 jurisdiction).
b Florida reports only a subset of suspended attorneys who are required to take the Florida portion of the examination only. Disbarred and other suspended attorneys who are required to take the regular bar examination are reported with other test takers.


Ten-Year Summary of Bar Passage Rates, 2005–2014

Jurisdiction 2005 2006 2007 2008 2009 2010 2011 2012 2013 2014

Alabama Overall 64% 65% 64% 67% 65% 67% 65% 64% 64% 62%

First-Time 80% 80% 78% 79% 77% 78% 77% 76% 78% 79%

Alaska Overall 63% 62% 60% 70% 58% 71% 59% 67% 66% 66%

First-Time 75% 75% 82% 80% 72% 81% 71% 78% 80% 78%

Arizona Overall 67% 68% 70% 76% 73% 73% 70% 75% 73% 67%

First-Time 72% 75% 78% 84% 80% 81% 76% 80% 78% 73%

Arkansas Overall 70% 69% 70% 72% 67% 65% 71% 68% 65% 63%

First-Time 78% 80% 80% 83% 74% 72% 84% 76% 76% 76%

California Overall 46% 47% 49% 54% 49% 49% 51% 51% 51% 47%

First-Time 62% 65% 66% 71% 66% 65% 67% 65% 65% 60%

Colorado Overall 68% 68% 69% 73% 74% 74% 79% 77% 76% 74%

First-Time 78% 76% 78% 83% 85% 83% 86% 84% 82% 78%

Connecticut Overall 74% 75% 77% 78% 75% 71% 71% 73% 73% 75%

First-Time 81% 83% 86% 87% 83% 81% 82% 82% 81% 86%

Delaware Overall 57% 59% 62% 73% 63% 66% 67% 63% 72% 63%

First-Time 63% 67% 71% 80% 71% 72% 73% 69% 78% 69%

District of Columbia Overall 51% 51% 54% 56% 49% 41% 48% 51% 47% 40%

First-Time 69% 72% 76% 70% 65% 60% 69% 68% 61% 57%

Florida Overall 60% 64% 66% 71% 68% 69% 72% 71% 70% 65%

First-Time 71% 75% 78% 81% 78% 78% 80% 79% 78% 72%

Georgia Overall 73% 76% 75% 79% 76% 75% 76% 75% 76% 71%

First-Time 84% 86% 85% 89% 86% 84% 85% 84% 85% 80%

Hawaii Overall 71% 71% 70% 76% 76% 68% 75% 68% 73% 67%

First-Time 81% 77% 82% 88% 86% 77% 83% 75% 81% 74%

Idaho Overall 74% 79% 76% 72% 81% 78% 79% 80% 79% 68%

First-Time 80% 85% 81% 80% 86% 83% 85% 86% 83% 73%

Illinois Overall 78% 79% 82% 85% 84% 84% 83% 81% 82% 79%

First-Time 85% 87% 89% 91% 91% 89% 89% 87% 88% 85%

Indiana Overall 75% 76% 76% 78% 75% 75% 74% 72% 74% 69%

First-Time 84% 84% 84% 84% 83% 81% 83% 79% 83% 79%


Iowa Overall 80% 81% 83% 85% 88% 87% 84% 88% 88% 83%

First-Time 86% 88% 89% 90% 93% 91% 90% 92% 93% 84%

Kansas Overall 76% 82% 87% 86% 82% 84% 86% 84% 85% 82%

First-Time 81% 90% 91% 89% 86% 90% 89% 89% 89% 86%

Kentucky Overall 72% 73% 77% 77% 77% 77% 80% 76% 75% 76%

First-Time 80% 82% 87% 83% 86% 82% 86% 82% 81% 81%

Louisiana Overall 69% 70% 61% 62% 69% 61% 66% 59% 50% 62%

First-Time 72% 76% 63% 66% 72% 65% 70% 63% 58% 69%

Maine Overall 70% 73% 80% 86% 77% 88% 68% 68% 76% 71%

First-Time 81% 81% 84% 91% 82% 89% 73% 73% 81% 76%

Maryland Overall 65% 66% 67% 75% 69% 71% 74% 71% 73% 69%

First-Time 74% 78% 76% 85% 78% 80% 81% 78% 80% 76%

Massachusetts Overall 72% 77% 77% 80% 79% 81% 80% 77% 78% 73%

First-Time 82% 87% 86% 89% 87% 88% 87% 83% 85% 81%

Michigan Overall 64% 78% 76% 72% 81% 80% 76% 58% 62% 64%

First-Time 75% 87% 86% 82% 89% 85% 82% 64% 69% 72%

Minnesota Overall 81% 86% 88% 87% 85% 86% 88% 85% 85% 79%

First-Time 88% 91% 93% 91% 90% 92% 93% 91% 90% 84%

Mississippi Overall 85% 80% 81% 82% 78% 76% 73% 73% 77% 79%

First-Time 88% 86% 88% 88% 85% 80% 81% 81% 85% 87%

Missouri Overall 81% 82% 84% 87% 87% 86% 89% 89% 87% 84%

First-Time 88% 88% 90% 91% 91% 90% 93% 92% 90% 87%

Montana Overall 84% 91% 89% 91% 87% 89% 90% 91% 85% 65%

First-Time 89% 92% 88% 92% 89% 93% 91% 93% 89% 70%

Nebraska Overall 73% 80% 83% 84% 78% 81% 78% 73% 74% 70%

First-Time 85% 83% 89% 89% 88% 90% 83% 83% 77% 77%

Nevada Overall 59% 61% 60% 64% 60% 59% 65% 64% 61% 57%

First-Time 68% 72% 74% 77% 73% 73% 76% 73% 73% 68%

New Hampshire Overall 54% 77% 77% 88% 84% 80% 78% 82% 71% 81%

First-Time 61% 82% 84% 88% 85% 82% 81% 84% 75% 86%


New Jersey Overall 70% 73% 73% 77% 77% 76% 77% 71% 75% 71%

First-Time 77% 81% 82% 85% 84% 82% 84% 78% 79% 76%

New Mexico Overall 81% 86% 78% 85% 84% 81% 82% 84% 83% 83%

First-Time 85% 91% 83% 92% 91% 88% 88% 89% 91% 88%

New York Overall 62% 63% 64% 69% 65% 65% 64% 61% 64% 60%

First-Time 74% 77% 77% 81% 77% 76% 76% 74% 76% 73%

North Carolina Overall 64% 64% 65% 71% 67% 68% 70% 65% 59% 60%

First-Time 71% 75% 76% 83% 77% 78% 80% 79% 69% 80%

North Dakota Overall 83% 72% 69% 77% 80% 78% 83% 78% 72% 63%

First-Time 90% 83% 79% 85% 87% 84% 85% 81% 80% 65%

Ohio Overall 71% 74% 76% 79% 76% 78% 79% 76% 79% 73%

First-Time 80% 83% 86% 88% 86% 86% 86% 84% 86% 81%

Oklahoma Overall 82% 83% 85% 89% 80% 82% 83% 80% 81% 76%

First-Time 89% 91% 91% 93% 87% 89% 88% 84% 86% 84%

Oregon Overall 67% 72% 74% 71% 69% 68% 68% 72% 73% 65%

First-Time 74% 80% 81% 78% 77% 75% 78% 81% 80% 73%

Pennsylvania Overall 70% 71% 72% 77% 76% 74% 77% 73% 73% 71%

First-Time 80% 83% 83% 87% 86% 83% 85% 82% 81% 81%

Rhode Island Overall 65% 71% 75% 75% 74% 74% 69% 78% 71% 73%

First-Time 71% 77% 79% 79% 78% 79% 74% 83% 76% 77%

South Carolina Overall 80% 77% 79% 75% 72% 73% 73% 67% 75% 68%

First-Time 85% 78% 82% 82% 78% 80% 77% 73% 79% 73%

South Dakota Overall 72% 77% 85% 88% 83% 94% 94% 83% 87% 72%

First-Time 83% 85% 89% 95% 90% 99% 94% 86% 91% 75%

Tennessee Overall 74% 75% 71% 76% 68% 70% 69% 68% 73% 66%

First-Time 80% 79% 80% 83% 77% 79% 77% 73% 82% 72%

Texas Overall 71% 74% 76% 78% 78% 76% 80% 75% 80% 70%

First-Time 80% 82% 84% 84% 85% 83% 86% 82% 85% 77%

Utah Overall 86% 83% 81% 83% 83% 82% 84% 77% 82% 80%

First-Time 90% 89% 85% 87% 89% 89% 88% 82% 87% 87%


Vermont Overall 73% 68% 66% 65% 61% 76% 68% 65% 76% 67%

First-Time 80% 78% 70% 79% 68% 87% 71% 69% 83% 75%

Virginia Overall 68% 68% 67% 73% 69% 70% 72% 69% 71% 66%

First-Time 76% 74% 76% 82% 76% 77% 79% 77% 77% 72%

Washington Overall 71% 78% 77% 73% 67% 71% 66% 64% 76% 76%

First-Time 77% 80% 78% 74% 69% 70% 67% 66% 82% 80%

West Virginia Overall 64% 60% 63% 67% 73% 65% 74% 72% 68% 73%

First-Time 71% 64% 74% 79% 81% 75% 83% 82% 76% 82%

Wisconsin Overall 77% 78% 89% 89% 89% 90% 84% 83% 83% 74%

First-Time 80% 82% 92% 92% 93% 92% 88% 86% 88% 81%

Wyoming Overall 72% 72% 62% 64% 75% 71% 62% 53% 81% 72%

First-Time 80% 74% 70% 67% 79% 75% 62% 60% 84% 78%

Guam Overall 77% 75% 76% 75% 52% 80% 67% 57% 63% 68%

First-Time 100% 70% 79% 73% 60% 90% 81% 60% 64% 77%

N. Mariana Islands Overall 100% 88% 88% 83% 100% 63% 83% 100% 92% 88%

First-Time 100% 88% 86% 83% 100% 57% 100% 100% 92% 88%

Palau Overall 71% 27% — 67% 17% 57% 25% 30% 63% 18%

First-Time 71% 27% — 50% 17% 67% 0% 38% 67% 15%

Puerto Rico Overall 38% 46% 42% 44% 41% 42% 44% 36% 40% 39%

First-Time 46% 57% 52% 52% 48% 50% 50% 45% 45% 45%

Virgin Islands Overall 69% 73% 56% 76% 65% 71% 49% 64% 61% 73%

First-Time 70% 70% 65% 84% 70% 77% 52% 70% 70% 77%

AVERAGES Overall 64% 67% 67% 71% 68% 68% 69% 67% 68% 64%

First-Time 76% 78% 79% 82% 79% 79% 79% 77% 78% 74%


Admissions to the Bar by Type, 2010–2014

Admission by Examination; Admission on Motion/by Transferred UBE Scorea

Jurisdiction 2010 2011 2012 2013 2014 2010 2011 2012 2013 2014

Alabama 492 516 533 465 461 19 32 — 38/— 30/10

Alaska 106 70 106 103 79 19 36 44 27 37/8

Arizona 543 506 629 722 683 234 183 145 176/8 171/38

Arkansas 236 260 253 242 219 49 47 55 60 47

California 6,423 6,627 6,846 7,008 6,726 — — — — —

Colorado 1,005 1,101 1,080 1,019 914 130 155 157 185/13 245/45

Connecticut 635 531 585 564 516 15 28 83 116 81

Delaware 142 122 147 148 122 — — — — —

District of Columbia 191 194 204 92 253 2,875 2,970 2,932 3,028 2,670

Florida 3,190 3,646 3,342 3,476 3,137 — — — — —

Georgia 1,174 1,165 1,144 1,245 1,297 90 123 124 132 178

Hawaii 160 208 219 206 203 — — — — —

Idaho 149 137 183 158 132 91 73 92 63/10 71/34

Illinois 2,943 2,793 2,786 2,944 2,676 93 135 191 240 293

Indiana 618 578 625 609 565 42 65 52 66 58

Iowa 329 335 364 328 294 73 96 79 88 97

Kansas 370 356 322 316 277 47 39 116 77 94

Kentucky 486 554 476 581 475 62 91 83 87 91

Louisiana 671 744 664 533 722 — — — — —

Maine 168 157 145 152 128 4 6 20 31 48

Maryland 1,365 1,653 1,685 1,742 1,637 — — — — —

Massachusetts 2,216 2,278 2,289 2,233 1,998 162 138 174 178 194

Michigan 986 979 878 1,061 1,011 100 120 138 187 192

Minnesota 824 732 825 796 752 215 191 233 215/17 200/48

Mississippi 260 252 248 265 233 29 34 33 40 35

Missouri 861 877 922 911 899 72 88 111 115/8 138/29

Montana 150 192 200 170 112 — — — —/34 —/72

Nebraska 117 104 80 142 147 146 141 198 173/1 119/3

Nevada 373 542 550 343 319 — — — — —


New Hampshire 149 159 164 128 168 86 118 91 99/1 74/6

New Jersey 3,133 2,844 3,175 3,386 3,635 — — — — —

New Mexico 268 287 298 287 324 — — — — —

New York 9,649 9,309 9,046 9,698 10,273 483 546 613 553 476

North Carolina 998 1,032 1,094 997 1,102 107 69 76 94 107

North Dakota 69 67 102 85 76 70 128 185 174/8 132/28

Ohio 1,263 1,234 1,235 1,309 1,179 65 90 118 135 143

Oklahoma 380 411 510 392 328 61 54 73 71 69

Oregon 537 616 496 488 471 172 179 138 171 160

Pennsylvania 2,220 2,099 1,886 1,995 1,883 331 305 285 246 236

Rhode Island 202 185 204 201 158 — — — — —

South Carolina 466 508 526 598 469 — — — — —

South Dakota 74 74 87 91 52 18 22 23 30 22

Tennessee 700 681 668 858 709 150 140 124 153 135

Texas 2,929 3,097 2,988 3,356 2,892 328 379 408 480 533

Utah 385 545 390 424 441 67 61 53 53/22 61/43

Vermont 67 82 73 95 104 37 27 35 56 326

Virginia 1,645 1,411 1,577 1,528 1,224 60 41 43 62 98

Washington 950 923 935 1,006 910 231 225 232 318/29 484/69

West Virginia 193 224 221 208 185 66 83 73 66 53

Wisconsin 269 256 241 215 204 141 202 174 167 154

Wyoming 103 96 91 96 61 16 16 27 41/20 64/78

Guam 11 12 6 11 10 — — — — —

N. Mariana Islands 5 5 8 13 8 — 11 9 4 7

Palau 4 0 4 5 4 — — — — 7

Puerto Rico 465 557 466 491 495 — — — — —

Virgin Islands 37 23 25 23 29 — 2 — — 6

TOTALS 54,354 54,946 54,846 56,558 54,381 7,056 7,489 7,840 8,295/171 8,436/511

aNCBE began collecting data for admission by transferred UBE score in 2013. Any persons admitted by transferred UBE score in 2011 (the first administration of the UBE, in which three jurisdictions administered the UBE) and 2012 (in which six jurisdictions administered the UBE) are included in those jurisdictions’ admission on motion numbers.


Admissions to the Bar by Type, 2010–2014

Foreign Legal Consultants

Jurisdiction 2010 2011 2012 2013 2014

Arizona 1 — 1 1 —

California 5 3 4 13 17

Colorado — — — — 1

Delaware — 1 — — —

District of Columbia 6 8 11 13 6

Florida 32 47 52 60 9

Georgia 1 — 1 2 1

Hawaii — — — — 1

Illinois 2 — — 1

Iowa 1 — — — —

Massachusetts — 1 — 1 1

Michigan — — — — 1

Minnesota — 1 1 — 2

New Jersey 1 — — — —

New Mexico — — 1 — —

New York 13 23 36 26 36

North Carolina — — — — 1

Ohio — — — — 2

Pennsylvania — 1 — — 1

South Carolina — 2 1 — —

Texas 2 4 6 8 3

Virginia — — — 1 —

Washington — — 1 2 3

TOTALS 64 91 115 128 85

Admission by Diploma Privilegea

Jurisdiction 2010 2011 2012 2013 2014

New Hampshireb 14 19 20 22 22

Wisconsin 466 462 463 461 417

TOTALS 480 481 483 483 439

aDiploma privilege is defined as an admissions method that excuses students from a traditional bar examination.
bIndividuals are graduates of New Hampshire’s Daniel Webster Scholar Honors Program, which is a two-year, performance-based program that includes clinical experience, portfolio review, and meetings with bar examiners.


2014 Admissions to the Bar by Examination, on Motion, and by Diploma Privilege

[Figure: horizontal bar chart showing, for each jurisdiction (Alabama through the Virgin Islands), 2014 admissions by examination, on motion, and by diploma privilege; the count axis runs from 0 to 11,000.]

(Note: Some jurisdictions have relatively low percentages of on-motion admissions, which may not be easily visible in this chart. Please refer to the accompanying chart on pages 28–30 for precise numbers.)


The National Conference of Bar Examiners has produced the Multistate Bar Examination (MBE) since 1972. In 2014, the MBE was part of the bar examination in 54 jurisdictions.

The MBE consists of 200 multiple-choice questions in the following areas: Civil Procedure, Constitutional Law, Contracts, Criminal Law and Procedure, Evidence, Real Property, and Torts. The purpose of the MBE is to assess the extent to which an examinee can apply fundamental legal principles and legal reasoning to analyze given fact patterns.

Both a raw score and a scaled score are computed for each examinee. A raw score is the number of questions answered correctly. Raw scores from different administrations of the MBE are not comparable, primarily due to differences in the difficulty of the questions from one administration to the next. The statistical process of equating adjusts for variations in the difficulty of the questions, producing scaled scores that represent the same level of performance across all MBE administrations. For instance, if the questions appearing on the July MBE were more difficult than those appearing on the February MBE, then the scaled scores for the July MBE would be adjusted upward to account for this difference. These adjustments ensure that no examinee is unfairly penalized or rewarded for taking a more or less difficult exam. Each jurisdiction determines its own policy with regard to the relative weight given to the MBE and other scores. (Jurisdictions that administer the Uniform Bar Examination [UBE] weight the MBE component 50%.)
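To make the idea of equating concrete, here is a minimal, hypothetical sketch (in Python) of one simple approach, linear mean/sigma equating to a reference scale. This is not NCBE's actual procedure, and the form statistics below are invented; the sketch only illustrates why the same raw score can map to different scaled scores on test forms of different difficulty.

```python
# Illustrative only: simplified linear ("mean/sigma") equating of raw scores to a
# common reference scale. NCBE's operational MBE equating is more sophisticated;
# all numbers here are hypothetical.

def linear_equate(raw_score, form_mean, form_sd, ref_mean, ref_sd):
    """Express a raw score from one test form on a common reference scale."""
    z = (raw_score - form_mean) / form_sd   # examinee's standing on this form
    return ref_mean + z * ref_sd            # same standing on the reference scale

# Hypothetical form statistics: the July form is harder, so the same ability
# level produces a lower raw score than on the February form.
feb_form = {"mean": 128.0, "sd": 20.0}
jul_form = {"mean": 124.0, "sd": 20.0}
reference = {"mean": 140.0, "sd": 16.0}     # target scaled-score metric

raw = 130  # identical raw score earned on both forms
feb_scaled = linear_equate(raw, feb_form["mean"], feb_form["sd"], reference["mean"], reference["sd"])
jul_scaled = linear_equate(raw, jul_form["mean"], jul_form["sd"], reference["mean"], reference["sd"])

print(f"Raw 130, February form -> scaled {feb_scaled:.1f}")   # 141.6
print(f"Raw 130, July form     -> scaled {jul_scaled:.1f}")   # 144.8 (adjusted upward for a harder form)
```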


Jurisdictions Using the MBE in 2014

Key for Jurisdictions Using the MBE in 2014

Gray shading indicates jurisdictions using the MBE. Jurisdictions not shown on the map that are included in this category: the District of Columbia, Guam, Northern Mariana Islands, Palau, and Virgin Islands.

No shading indicates jurisdictions not using the MBE. Jurisdiction not shown on the map that is included in this category: Puerto Rico.


2014 MBE National Summary Statistics

(Based on Scaled Scores)a

February July 2014 Total

Number of Examinees 22,083 51,005 73,088

Mean Scaled Score 138.0 141.5 140.4

Standard Deviation 15.3 16.0 15.9

Maximum 187.1 187.5 187.5

Minimum 70.7 44.4 44.4

Median 138.3 142.2 141.2

[Figure: 2014 MBE National Score Distributions — chart of the percentage of examinees at each MBE scaled score, February Exam (Mean = 138.0) and July Exam (Mean = 141.5).]

2014 MBE National Score Distributionsa

MBE Scaled Scoreb % of Examinees, February (Mean = 138.0) % of Examinees, July (Mean = 141.5)

85 0.0 0.1
90 0.1 0.1
95 0.3 0.2
100 0.6 0.4
105 1.2 1.0
110 1.9 1.5
115 3.4 2.6
120 5.4 4.1
125 7.3 6.1
130 9.9 8.5
135 12.3 10.6
140 11.6 10.5
145 13.5 12.9
150 11.7 11.2
155 8.1 10.3
160 6.3 8.4
165 3.4 5.3
170 1.7 4.0
175 0.8 1.6
180 0.2 0.7
185 0.1 0.1
190 0.0 0.0

aThe values reflect valid scores available electronically as of 1/21/2015.
bThese data represent scaled scores in increments of 5. For example, the percentage reported for 135 includes examinees whose MBE scaled scores were between 130.5 and 135.4.


MBE National Mean Scaled Scores, 2005–2014a

February July Year Total
2005 137.7 141.6 140.4
2006 137.5 143.3 141.5
2007 136.9 143.7 141.6
2008 137.7 145.6 143.3
2009 135.7 144.5 142.1
2010 136.6 143.6 141.7
2011 138.6 143.8 142.3
2012 137.0 143.4 141.6
2013 138.0 144.3 142.5
2014 138.0 141.5 140.4

[Figure: MBE national mean scaled scores by year, 2005–2014, February Exam and July Exam.]

MBE National Examinee Counts, 2005–2014a

February July Year Total
2005 21,265 49,998 71,263
2006 22,824 51,176 74,000
2007 22,250 50,181 72,431
2008 20,822 50,011 70,833
2009 18,868 50,385 69,253
2010 19,504 50,114 69,618
2011 20,369 49,933 70,302
2012 20,695 52,337 73,032
2013 21,578 53,706 75,284
2014 22,083 51,005 73,088

[Figure: MBE national examinee counts by year, 2005–2014, February Exam and July Exam.]

aThe values reflect valid scores available electronically as of 1/21/2015.


The National Conference of Bar Examiners has produced the Multistate Professional Responsibility Examination (MPRE) since 1980. In 2014, the MPRE was required in 53 jurisdictions.

The MPRE consists of 60 multiple-choice questions whose scope of coverage includes the following: regulation of the legal profession; the client-lawyer relationship; client confidentiality; conflicts of interest; competence, legal malpractice, and other civil liability; litigation and other forms of advocacy; transactions and communications with persons other than clients; different roles of the lawyer; safekeeping funds and other property; communications about legal services; lawyers’ duties to the public and the legal system; and judicial conduct. The purpose of the MPRE is to measure the examinee’s knowledge and understanding of established standards related to a lawyer’s professional conduct.

The MPRE scaled score is a standard score. Standard scaled scores range from 50 (low) to 150 (high). The mean (average) scaled score was established at 100, based upon the performance of the examinees who took the MPRE in March 1999. The conversion of raw scores to scaled scores involves a statistical process that adjusts for variations in the difficulty of different forms of the examination so that any particular scaled score will represent the same level of knowledge from test to test. For instance, if a test is more difficult than previous tests, then the scaled scores on that test will be adjusted upward to account for this difference. If a test is easier than previous tests, then the scaled scores on the test will be adjusted downward to account for this difference. The purpose of these adjustments is to help ensure that no examinee is unfairly penalized or rewarded for taking a more or less difficult form of the test. Passing scores are established by each jurisdiction.


Jurisdictions Using the MPRE in 2014 (with Pass/Fail Standards Indicated)

Key for Jurisdictions Using the MPRE in 2014

Gray shading indicates jurisdictions using the MPRE. Jurisdictions not shown on the map that are included in this category: the District of Columbia (75), Guam (80), Northern Mariana Islands (80), Palau (75), and Virgin Islands (75).

No shading indicates jurisdictions not using the MPRE. Jurisdiction not shown on the map that is included in this category: Puerto Rico.


[Figure: 2014 MPRE National Score Distributions — chart of the percentage of examinees at each MPRE scaled score, March (Mean = 93.1), August (Mean = 93.1), and November (Mean = 94.5).]

2014 MPRE National Score Distributionsa

MPRE Scaled Scoreb % of Examinees, March (Mean = 93.1) % of Examinees, August (Mean = 93.1) % of Examinees, November (Mean = 94.5)

50 1.8 2.5 1.4
60 6.6 5.7 6.0
70 13.2 12.8 12.4
80 20.8 21.2 20.6
90 24.1 25.2 23.9
100 16.0 15.7 16.6
110 10.9 10.5 13.5
120 5.2 5.0 3.6
130 1.3 1.1 1.9
140 0.1 0.3 0.2
150 0.0 0.0 0.0

aThe values reflect valid scores available electronically as of 1/23/2015 on both standard and alternative forms of the MPRE.
bThese data represent scaled scores in increments of 10. For example, the percentage reported for 70 includes examinees whose MPRE scaled scores were between 70 and 79.

2014 MPRE National Summary Statistics

(Based on Scaled Scores)a

March August November 2014 Total

Number of Examinees 22,957 17,699 19,888 60,544

Mean Scaled Score 93.1 93.1 94.5 93.6

Standard Deviation 16.4 17.0 16.4 16.6

Maximum 149 145 150 150

Minimum 50 50 50 50

Median 94 94 94 94


MPRE National Mean Scaled Scores, 2005–2014a

Mar./Apr. Aug. Nov. Year Total
2005 98.3 98.0 99.6 98.7
2006 98.6 96.9 98.1 98.0
2007 98.5 98.0 99.2 98.6
2008 98.9 95.6 97.9 97.6
2009 98.8 95.8 97.3 97.4
2010 97.4 95.7 97.2 96.8
2011 97.1 93.4 96.3 95.7
2012 99.3 95.8 97.2 97.6
2013 94.6 94.3 98.1 95.6
2014 93.1 93.1 94.5 93.6

[Figure: MPRE national mean scaled scores by year, 2005–2014, Mar./Apr., August, and November exams.]

MPRE National Examinee Counts, 2005–2014a

Mar./Apr. Aug. Nov. Year Total
2005 19,869 15,703 21,716 57,288
2006 21,684 15,986 23,308 60,978
2007 21,724 17,107 23,404 62,235
2008 20,288 16,536 23,568 60,392
2009 21,755 18,085 22,483 62,323
2010 22,478 18,641 23,345 64,464
2011 22,136 19,773 24,731 66,640
2012 24,280 19,028 23,191 66,499
2013 22,320 19,895 20,459 62,674
2014 22,957 17,699 19,888 60,544

[Figure: MPRE national examinee counts by year, 2005–2014, Mar./Apr., August, and November exams.]

aThe values reflect valid scores available electronically as of 1/23/2015 on both standard and alternative forms of the MPRE.


The National Conference of Bar Examiners has produced the Multistate Essay Examination (MEE) since 1988. In 2014, the MEE was used in 31 jurisdictions.

NCBE offers six 30-minute questions per administration.

The purpose of the MEE is to test the examinee’s ability to (1) identify legal issues raised by a hypothetical factual situation; (2) separate material which is relevant from that which is not; (3) present a reasoned analysis of the relevant issues in a clear, concise, and well-organized composition; and (4) demonstrate an understanding of the fundamental legal principles relevant to the probable solution of the issues raised by the factual situation. The primary distinction between the MEE and the Multistate Bar Examination (MBE) is that the MEE requires the examinee to demonstrate an ability to communicate effectively in writing.

Areas of law that may be covered on the MEE include the following: Business Associations (Agency and Partnership; Corporations and Limited Liability Companies), Civil Procedure, Conflict of Laws, Constitutional Law, Contracts, Criminal Law and Procedure, Evidence, Family Law, Real Property, Torts, Trusts and Estates (Decedents’ Estates; Trusts and Future Interests), and Uniform Commercial Code (Secured Transactions). Some questions may include issues in more than one area of law. The particular areas covered vary from exam to exam. Each jurisdiction determines its own policy with regard to the relative weight given to the MEE and other scores. (Jurisdictions that administer the Uniform Bar Examination [UBE] weight the MEE component 30%.)


Jurisdictions Using the MEE in 2014

Key for Jurisdictions Using the MEE in 2014

Gray shading indicates jurisdictions using the MEE. Jurisdictions not shown on the map that are included in this category: the District of Columbia, Guam, Northern Mariana Islands, and Palau.

No shading indicates jurisdictions not using the MEE. Jurisdictions not shown on the map that are included in this category: Puerto Rico and Virgin Islands.

*Alaska began administering the MEE in July 2014.



The National Conference of Bar Examiners has produced the Multistate Performance Test (MPT) since 1997. In 2014, the MPT was used in 41 jurisdictions.

NCBE offers two 90-minute MPT items per administration. A jurisdiction may select one or both items to include as part of its bar examination. (Jurisdictions that administer the Uniform Bar Examination [UBE] use two MPTs as part of their bar examinations.)

The MPT is designed to test an examinee’s ability to use fundamental lawyering skills in a realistic situation. Each test evaluates an examinee’s ability to complete a task that a beginning lawyer should be able to accomplish. The MPT requires examinees to (1) sort detailed factual materials and separate relevant from irrelevant facts; (2) analyze statutory, case, and administrative materials for applicable principles of law; (3) apply the relevant law to the relevant facts in a manner likely to resolve a client’s problem; (4) identify and resolve ethical dilemmas, when present; (5) communicate effectively in writing; and (6) complete a lawyering task within time constraints. Each jurisdiction determines its own policy with regard to the relative weight given to the MPT and other scores. (Jurisdictions that administer the UBE weight the MPT component 20%.)
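Pulling together the component weights noted in the MBE, MEE, and MPT descriptions above, the following minimal sketch (in Python) shows how a UBE total could be assembled from the 50%/30%/20% weighting. The function name and example scores are hypothetical, and the scaling of the written components onto a common scale (handled by NCBE and the jurisdictions) is assumed rather than shown.

```python
# Minimal sketch, not an official formula: combine UBE components using the
# 50% MBE / 30% MEE / 20% MPT weights described above. Assumes all components
# are already expressed on a common scale; the example scores are hypothetical.

UBE_WEIGHTS = {"MBE": 0.50, "MEE": 0.30, "MPT": 0.20}

def combined_ube_score(components: dict) -> float:
    """Weighted combination of component scores expressed on the same scale."""
    return sum(UBE_WEIGHTS[name] * score for name, score in components.items())

example = {"MBE": 142.0, "MEE": 138.0, "MPT": 135.0}   # hypothetical scaled scores
print(round(combined_ube_score(example), 1))            # 139.4
```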


Jurisdictions Using the MPT in 2014

Key for Jurisdictions Using the MPT in 2014

Gray shading indicates jurisdictions using the MPT. Jurisdictions not shown on the map that are included in this category: the District of Columbia, Guam, Northern Mariana Islands, and Palau.

No shading indicates jurisdictions not using the MPT. Jurisdictions not shown on the map that are included in this category: Puerto Rico and Virgin Islands.


The Revised ABA Standards for Approval of Law Schools: An Overview of the Major Changes

by Jeffrey E. Lewis

The revised ABA Standards and Rules of Procedure for Approval of Law Schools (Standards), which became effective on August 12, 2014, are the culmination of a comprehensive review of the Standards begun in 2008. The Comprehensive Review was undertaken by the Standards Review Committee (SRC) of the ABA Section of Legal Education and Admissions to the Bar (Section) under direction of the Section Council (Council), which is recognized by the U.S. Department of Education as the national accrediting agency for programs leading to the J.D. degree in American law schools.1

The SRC is charged with reviewing proposed changes or additions to the Standards and may also initiate such changes. (The Department of Education requires that all accrediting agencies periodically review and update their standards and policies pertaining to approval of schools and programs.) This Comprehensive Review was undertaken in light of recommendations from the Accreditation Policy Task Force2—assembled in 2007 to review accreditation policies and practices by taking a fresh look from a policy perspective—and from three special committees appointed to consider the recommendations of the Accreditation Policy Task Force and to report their suggestions to the Council: the Special Committees on Transparency, Security of Position, and Outcome Measures.3 Thus, the review began with a considerable prelude of careful thought about needed change.

During the Comprehensive Review the SRC held 23 public meetings, met with various interest groups, and participated in conferences, including those of the ABA, the Association of American Law Schools, and the National Conference of Bar Examiners (NCBE). The SRC agenda and draft proposals and recommendations to the Council were widely published. Hundreds of comments were received and considered by the SRC and the Council. Six hearings were held during the various notice and comment periods following Council preliminary approval. The Committee also held two open forum meetings for interested parties to provide comments to the Committee before making its recommendations to the Council.

The Comprehensive Review was completed with Council approval of the SRC recommendations at its March and June meetings in 2014, and with the concurrence of the ABA House of Delegates at the ABA Annual Meeting in Boston on August 11, 2014. While the new Standards became effective after that meeting, the Council and the Section, cognizant that law schools will need time to do the work that some of the revised and new Standards will require, established a transition and implementation plan. Under this plan, site visits in 2014–2015 will substantially rely on the 2013–2014 Standards. The new Standards will be applied to site visits beginning in 2015–2016, with the exception of certain new Standards pertaining to learning outcomes, curriculum changes, and assessment methods, which will be applied beginning in 2016–2017 and applied as appropriate to students who become 1L students that year.4 The revised Rules of Procedure, on the other hand, do not require a delay for implementation.

This article highlights some of the major changes to the Standards and offers examples of those changes. Most categories of change include too many examples to cover; in these cases, the most important changes are highlighted. Selected pertinent excerpts from the new Standards are on pages 50–52. While the Rules of Procedure were also substantially modified, those changes are not discussed in this article. Note that many Standards and Interpretations were renumbered as a result of the significant revisions (references in this article are to the revised Standards unless otherwise noted); the material found on the Standards web page of the ABA Section is helpful in that regard.5

Categories of Major Changes to the Standards

Learning Outcomes: The Standards were revised to incorporate student learning outcomes. This is one of the most significant developments in the new Standards.

New Standards 301(b) and 302, Learning Outcomes, introduce the requirement that law schools establish and publish learning outcomes designed to achieve the objectives of the program of legal education. This development is in line with the recommendation of the Outcome Measures Committee to increase reliance on output measures. It is consistent with the best practices in higher education.

Certain minimum learning outcomes requiring competency in four key areas are outlined, though broadly stated to give law schools maximum flexibility. (See the sidebar on pages 50–52 for the learning outcomes specified in Standard 302.) Interpretation 302-1 states that other professional skills in which competency is required are to be determined by the law school and provides a nonexclusive list of skills that may be included, while Interpretation 302-2 allows a law school to identify any additional learning outcomes pertinent to its program.

Assessment of Student Learning: In a related development, new Standard 314, Assessment of Student Learning, requires that law schools use formative and summative assessment methods to measure student learning and to provide meaningful feedback to students.

Interpretation 314-1 provides definitions of formative and summative assessment methods (see the sidebar on pages 50–52), while Interpretation 314-2 provides for flexibility in implementing the assessment requirement.

Evaluation of Program of Legal Education, Learning Outcomes, and Assessment Methods: A further related development is the adoption of new Standard 315, Evaluation of Program of Legal Education, Learning Outcomes, and Assessment Methods, which requires that law schools evaluate their programs of legal education, adopted learning outcomes, and assessment methods on a regular basis.

Law schools are expected to use the results of their evaluations to assess the degree to which students have achieved competency in the learning outcomes and to improve their programs accordingly. Interpretation 315-1 gives law schools flexibility in determining what assessment methods to use across the curriculum. The transition and implementation plan for the Standards provides a phase-in period so that law schools may develop their learning outcomes and assessment methods to be in compliance with the new Standard.

Curriculum—Professional and Practical Training: The revised Standards emphasize professional and practical training through new curricular requirements.

Although the Curriculum Standard (now Standard 303) already mandated that law schools require each student to receive substantial instruction in the areas of professional responsibility and professional skills, the Standard now contains a specific graduation requirement of credit hours in these areas. Under the revised Standard, law schools must require each student to satisfactorily complete one course of at least two credit hours in professional responsibility and at least six credit hours of a course or courses in experiential learning. Simulation courses, law clinics, and field placements qualify as experiential learning courses as long as they involve professional skills with multiple opportunities for performance and self-evaluation. New Standard 304, Simulation Courses and Law Clinics, defines and sets out the qualification requirements for simulation courses and law clinics; field placements are addressed under modified Standard 305.

The types of "substantial opportunities" for additional professional and practical training required to be offered by law schools in former Standard 302(b) (for live-client or other real-life practice experiences, student participation in pro bono activities, and small group work) have been reworded in what is now Standard 303(b); those substantial opportunities are now listed as law clinics or field placements and pro bono legal services, including law-related public service (small group work was eliminated).

Revised Interpretation 303-3 (former Interpretation 302-10) encouraging law schools to promote pro bono opportunities for law students now references pro bono legal services in the context of the ABA Model Rules of Professional Conduct and recommends that each student provide at least 50 hours of pro bono service during law school.

Increased Flexibility: One of the primary goals of the Comprehensive Review of the Standards was to provide law schools increased flexibility.

For example, one of the most significant changes in the Standards is the elimination of the calculation of the student-faculty ratio (former Interpretations 402-1 and 402-2). The SRC was of the view that the ratio did not properly account for all students enrolled in the law school and did not properly account for the size of the faculty, given the important changes in law school curriculum, teaching methodologies, and administrative structures since these two Interpretations were adopted. There are a number of factors that can be considered in making a functional judgment about the adequacy of the teaching faculty without resorting to a student-faculty ratio calculation. The elimination of the student-faculty ratio removes from the Standards what was widely viewed as an artificial and misleading calculation that did not accurately reflect the quality of the legal education being delivered in any particular law school.

Experimentation and Innovation: Experimentation and innovation are also encouraged with the new variance process found in Standard 107.

Standard 107 distinguishes between variances in an emergency and those sought to experiment with a new or innovative program or concept. In making this distinction, experimentation is encouraged. The Council may grant a variance if it is consistent with the purpose and objectives of the Standards overall. The SRC was of the view that the new experimental variance rule has the potential to improve the delivery of legal education over time, and that the potential benefits of Council-authorized variances outweigh any potential harm.

Admission Test: Interpretation 503-3 provides a limited variance to the use of the LSAT in admissions.

There has long been a presumption that the LSAT satisfied the Standard 503 requirement of a "valid and reliable admission test." The new Interpretation provides a safe harbor variance from the use of the LSAT in very limited circumstances and for no more than 10% of any entering class. Outside of the safe harbor, the variance process is available to clarify when any other alternative test admissions programs may be employed on an experimental basis. (See the sidebar on pages 50–52 for the circumstances of the variance.)

Technological Developments: A number of changes were made in response to developments in the technological environment, two examples of which are listed below. These changes also provide greater flexibility to law schools.

Distance Education: Standard 306 now provides that a law school may grant up to a total of 15 credit hours toward the J.D. degree for distance education courses that otherwise qualify under the Standards. This is one-sixth of the typical law student’s law study and an increase from 12 credit hours in the former Standard. Importantly, while the opportunity for distance education has been increased, the rules governing creditable distance education are still properly designed to ensure that the law student’s academic experience is comparable to the traditional classroom experience. Standard 306 was generally updated for more clarity and now includes a clearer definition of a distance education course.

Library Collection: Options for the format of the law library collection have also changed considerably, with a movement from the traditional book collection to databases that are electronically available. In recognition of this development, Standard 606, which referenced a "core collection of essential materials accessible in the law library," has been amended to require "a core collection of essential materials through ownership or reliable access." Interpretation 606-2 further defines "reliable access" by providing guidance on ways in which to fulfill this requirement.

Additional Substantive Changes: A number of other substantive changes were made to the Standards, a few examples of which are listed below.

Granting of J.D. Degree Credit for Prior Law Study: At least two-thirds of the credits needed for the J.D. degree must now be obtained in an ABA-accredited J.D. curriculum. Specifically, new Standard 505 provides that the total credits permitted for prior law study (abroad, or non-ABA-accredited school, or LL.M. program) are limited to one-third of the credits required for the J.D. degree. This ends the practice of some schools to grant one year of credit for a law school education abroad and one year of J.D. credit for an LL.M. at their law school toward the equivalent of three years of the required J.D. credits. One or the other, but not both, may be credited toward the J.D. degree.

Student Support Services: Another substantive change was the modification of Standard 508 to provide that debt counseling is a mandatory function for student support services. While most law schools have long provided this counseling, it is now no longer a matter of discretion to do so.

Qualifications for Admission to the Bar: The substance of Standard 504 was changed to specify that the requirement to advise students of the "character, fitness, and other qualifications for admission to the bar in every U.S. jurisdiction" must be fulfilled by including a statement in its application for admission and on its website. The revised Standard provides explicit language to be used for the statement (see the sidebar on pages 50–52).

Increased Objectivity: The Standards were modified in a number of instances to be more objective.

An important example is found in Standard 101, Basic Requirements for Approval. Specifically, Standard 101(a) previously required that an approved law school "shall demonstrate that its program is consistent with sound legal education principles. It does so by establishing that it is being operated in compliance with the Standards." That language was changed to make the Standard more objective: an approved law school "shall demonstrate that it is being operated in compliance with the Standards." Changes such as this were made throughout the Standards.

Requirements Regarding Policies: Where law schools are required to have policies, there is a new requirement that they must adopt, publish, and adhere to those policies.

For example, Standard 303 previously required a law school to "have and adhere to sound academic standards." The revised Standard, now Standard 308, requires law schools to "adopt, publish, and adhere to sound academic standards." This "adopt, publish, and adhere to" language is used in several places throughout the revised Standards and was intended to achieve more objectivity as well as to increase transparency.

Reporting Requirements: The Standards were revised to highlight reporting requirements.

For example, old Interpretation 101-1 covering information that must be furnished by law schools to the Accreditation Committee and the Council was upgraded in importance by moving it into new Standard 104, which provides that the information provided by a law school must be "complete, accurate, and not misleading and must be submitted in the form, manner, and time frame specified by the Council." Moving this requirement into the Standard highlights the importance of providing accurate information to the Council.

Conforming to Department of Education Requirements: Some changes in the Standards were required by U.S. Department of Education regulations.

For example, new Standard 310 uses the U.S. Department of Education definition of a credit hour: 50 minutes of classroom or direct faculty instruction plus 120 minutes of out-of-class work per week for 15 weeks (including one week for a final exam). In other words, a total of 170 minutes per week for 15 weeks of instruction (including one week of exams) qualifies for one academic credit. The Standard also provides some alternate ways of determining the time; it refers to an "equivalent amount of work over a different amount of time." This represents a shift from the use of minutes to the use of the concept of a credit hour to describe the various curricular requirements of the Standards.
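As a quick check of the credit-hour arithmetic just described, here is a minimal sketch (in Python). The helper and the course sizes are hypothetical illustrations of the 50 + 120 minutes-per-week definition, not anything prescribed by the Standards.

```python
# Minimal sketch of the credit-hour definition described above:
# 50 minutes of class plus 120 minutes of out-of-class work per week,
# over 15 weeks (including one week for a final exam). Hypothetical helper.

CLASS_MIN_PER_WEEK = 50
OUT_OF_CLASS_MIN_PER_WEEK = 120
WEEKS = 15

def total_minutes(credit_hours: int) -> int:
    """Total class plus out-of-class minutes implied for a course."""
    per_credit_per_week = CLASS_MIN_PER_WEEK + OUT_OF_CLASS_MIN_PER_WEEK  # 170
    return credit_hours * per_credit_per_week * WEEKS

print(total_minutes(1))  # 2550 minutes for a single credit hour
print(total_minutes(3))  # 7650 minutes for a typical 3-credit course
```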

Required Disclosures: Revised Standard 509 builds on recent Council action strengthening the reporting requirements for consumer information.6

Law school reporting of information such as employment and conditional scholarships must now be accomplished through prescribed charts. This makes it possible for the prospective law student to achieve a reliable and accurate comparison between law schools on these important considerations. Law school reporting of other required information (such as academic requirements and transfer of credit) is not susceptible to a specific format but must be disclosed in a "readable and comprehensive manner."

Elimination of Certain Standards and Interpretations: Several Standards and Interpretations were eliminated because they were seen as being unenforceable, unnecessary, unclear, or repetitive.

Most notable was the elimination of the 20-hour limitation on employment for full-time students (former Standard 304(f)); this was viewed as fundamentally unenforceable.

Redesignation of Certain Interpretations as Standards: The SRC identified a number of Interpretations that were, in their substance, Standards and not mere interpretations of Standards. They were, therefore, redesignated as Standards. Such redesignation was one of the most significant changes to the Standards.

For example, former Interpretation 305-4(a) provided that "[a] law school that has a field placement program shall develop, publish and communicate to students and field instructors a statement that describes the educational objectives of the program." This is now Standard 305(f) (with minor modifications). (Other examples are listed under other categories of change in this article.)

Standards Subject to Future Evaluation

Several issues addressed by the SRC and the Council were highly controversial, and in the end no changes were made:

• Bar Passage: The SRC proposed changes to the bar passage provision (former Interpretation 301-6, regarding determination of the sufficiency of a law school’s bar passage rate in meeting the objectives of its program of legal education to prepare its students for admission to the bar). Most significant was the proposal to require a law school to report bar examination results for all its graduates known to have taken a bar examination within two calendar years from graduation. The proposals were stalled in part due to complaints from law schools that obtaining information regarding bar passage for all students was a difficult task for law schools. Former Interpretation 301-6 was moved, unchanged, to new Standard 316.

• Professional Environment: The attempt to clarify the requirements regarding tenure also failed to pass. Standard 405, Professional Environment, governs the status and security of position for law faculty. Alternative proposals to modify this Standard generated substantial criticism from law school faculty during the comment period.7 Since no proposal for change garnered the approval of a majority of the Council, current Standard 405 remains in place.

• Credit for Compensated Field Placement Programs: Another controversial issue was the prohibiting of law schools from granting credit for field placement programs for which the student receives compensation (former Interpretation 305-3, now Interpretation 305-2). Retention of this Interpretation was recommended by the Council, but it was referred back to the Council after the House of Delegates heard strong testimony for and against the provision. The House concurred in all of the proposed new Standards and Rules of Procedure with the exception of Interpretation 305-2. This Interpretation remains in place pending further review by the Council.

Several other matters raised during the Comprehensive Review will continue to be studied. For example, one remaining issue is whether certain groups currently covered by the non-discrimination Standard, such as those with disabilities or certain sexual orientation characteristics, should also be included in the Standard that requires law schools to demonstrate by concrete action a commitment to diversity and inclusion.

Conclusion

Overall, as a result of the changes to the Standards, programs of legal education in American law schools will remain rigorous, while at the same time becoming more practical and skills-focused. There will be a greater focus on outcomes (such as learning outcomes, bar exam results, and employment rates). The revised Standards also require increased reporting of consumer information for greater transparency. In

(text continues on page 53)


Excerpts from the 2014–2015 Standards and Rules of Procedure for Approval of Law Schools

CHAPTER 3: PROGRAM OF LEGAL EDUCATION

Standard 301. OBJECTIVES OF PROGRAM OF LEGAL EDUCATION

(a) A law school shall maintain a rigorous program of legal education that prepares its students, upon graduation, for admission to the bar and for effective, ethical, and responsible participation as members of the legal profession.

(b) A law school shall establish and publish learning outcomes designed to achieve these objectives.

Standard 302. LEARNING OUTCOMES

A law school shall establish learning outcomes that shall, at a minimum, include competency in the following:

(a) Knowledge and understanding of substantive and procedural law;

(b) Legal analysis and reasoning, legal research, problem-solving, and written and oral communication in the legal context;

(c) Exercise of proper professional and ethical responsibilities to clients and the legal system; and

(d) Other professional skills needed for competent and ethical participation as a member of the legal profession.

Interpretation 302-1

For the purposes of Standard 302(d), other professional skills are determined by the law school and may include skills such as interviewing, counseling, negotiation, fact development and analysis, trial practice, document drafting, conflict resolution, organization and management of legal work, collaboration, cultural competency, and self-evaluation.

Interpretation 302-2

A law school may also identify any additional learning outcomes pertinent to its program of legal education.

Standard 303. CURRICULUM

(a) A law school shall offer a curriculum that requires each student to satisfactorily complete at least the following:

(1) one course of at least two credit hours in professional responsibility that includes substantial instruction in the history, goals, structure, values, and responsibilities of the legal profession and its members;

(2) one writing experience in the first year and at least one additional writing experience after the first year, both of which are faculty supervised; and

(3) one or more experiential course(s) totaling at least six credit hours. An experiential course must be a simulation course, a law clinic, or a field placement. To satisfy this requirement, a course must be primarily experiential in nature and must:

(i) integrate doctrine, theory, skills, and legal ethics, and engage students in performance of one or more of the professional skills identified in Standard 302;

(ii) develop the concepts underlying the professional skills being taught;

(iii) provide multiple opportunities for performance; and

(iv) provide opportunities for self-evaluation.

(b) A law school shall provide substantial opportunities to students for:

(1) law clinics or field placement(s); and

(2) student participation in pro bono legal services, including law-related public service activities.

Interpretation 303-3

Rule 6.1 of the ABA Model Rules of Professional Conduct encourages lawyers to provide pro bono legal services primarily to persons of limited means or to organizations that serve such persons. In addition, lawyers are encouraged to provide pro bono law-related public service. In meeting the requirement of Standard 303(b)(2), law schools are encouraged to promote opportunities for law student pro bono service that incorporate the priorities established in Model Rule 6.1. In addition, law schools are encouraged to promote opportunities for law students to provide over their law school career at least 50 hours of pro bono service that complies with Standard 303(b)(2). Pro bono and public service opportunities need not be structured to accomplish any of the outcomes required by Standard 302. Standard 303(b)(2) does not preclude the inclusion of credit-granting activities within a law school’s overall program of law-related pro bono opportunities so long as law-related non-credit bearing initiatives are also part of that program.

Standard 304. SIMULATION COURSES AND LAW CLINICS

(a) A simulation course provides substantial experience not involving an actual client, that (1) is reasonably similar to the experience of a lawyer advising or representing a client or engaging in other lawyering tasks in a set of facts and circumstances devised or adopted by a faculty member, and (2) includes the following:

(i) direct supervision of the student’s performance by the faculty member;

(ii) opportunities for performance, feedback from a faculty member, and self-evaluation; and

(iii) a classroom instructional component.

(b) A law clinic provides substantial lawyering experience that (1) involves one or more actual clients, and (2) includes the following:

(i) advising or representing a client;

(ii) direct supervision of the student’s performance by a faculty member;

(iii) opportunities for performance, feedback from a faculty member, and self-evaluation; and

(iv) a classroom instructional component.

Standard 306. DISTANCE EDUCATION

(a) A distance education course is one in which students are separated from the faculty member or each other for more than one-third of the instruction and the instruction involves the use of technology to support regular and substantive interaction among students and between the students and the faculty member, either synchronously or asynchronously.


(b) Credit for a distance education course shall be awarded only if the academic content, the method of course delivery, and the method of evaluating student performance are approved as part of the school’s regular curriculum approval process.

(c) A law school shall have the technological capacity, staff, information resources, and facilities necessary to assure the educational quality of distance education.

(d) A law school may award credit for distance education and may count that credit toward the 64 credit hours of regularly scheduled classroom sessions or direct faculty instruction required by Standard 310(b) if:

(1) there is opportunity for regular and substantive interaction between faculty member and student and among students;

(2) there is regular monitoring of student effort by the faculty member and opportunity for communication about that effort; and

(3) the learning outcomes for the course are consistent with Standard 302.

(e) A law school shall not grant a student more than a total of 15 credit hours toward the J.D. degree for courses qualifying under this Standard.

(f) A law school shall not enroll a student in courses qualifying for credit under this Standard until that student has completed instruction equivalent to 28 credit hours toward the J.D. degree.

(g) A law school shall establish an effective process for verifying the identity of students taking distance education courses and that also protects student privacy. If any additional student charges are associated with verification of student identity, students must be notified at the time of registration or enrollment.

Interpretation 306-1

Technology used to support a distance education course may include, for example:

(a) The Internet;

(b) One-way and two-way transmissions through open broadcast, closed circuit, cable, microwave, broadband lines, fiber optics, satellite, or wireless communications devices;

(c) Audio and video conferencing; or

(d) Video cassettes, DVDs, and CD-ROMs, if the cassettes, DVDs, or CD-ROMs are used in a course in conjunction with any of the technologies listed in paragraphs (a) through (c).

Interpretation 306-2

Methods to verify student identity as required in Standard 306(g) include, but are not limited to (i) a secure login and pass code, (ii) proctored examinations, and (iii) other technologies and practices that are effective in verifying student identity. As part of the verification process, a law school shall verify that the student who registers for a class is the same student that participates and takes any examinations for the class.

Standard 314. ASSESSMENT OF STUDENT LEARNING

A law school shall utilize both formative and summative assessment methods in its curriculum to measure and improve student learning and provide meaningful feedback to students.

Interpretation 314-1

Formative assessment methods are measurements at different points during a particular course or at different points over the span of a student’s education that provide meaningful feedback to improve student learning. Summative assessment methods are measurements at the culmination of a particular course or at the culmination of any part of a student’s legal education that measure the degree of student learning.

Interpretation 314-2

A law school need not apply multiple assessment methods in any particular course. Assessment methods are likely to be different from school to school. Law schools are not required by Standard 314 to use any particular assessment method.

Standard 315. EVALUATION OF PROGRAM OF LEGAL EDUCATION, LEARNING OUTCOMES, AND ASSESSMENT METHODS

The dean and the faculty of a law school shall conduct ongoing evaluation of the law school’s program of legal education, learning outcomes, and assessment methods; and shall use the results of this evaluation to determine the degree of student attainment of competency in the learning outcomes and to make appropriate changes to improve the curriculum.

Interpretation 315-1

Examples of methods that may be used to measure the degree to which students have attained competency in the school’s student learning outcomes include review of the records the law school maintains to measure individual student achievement pursuant to Standard 314; evaluation of student learning portfolios; student evaluation of the sufficiency of their education; student performance in capstone courses or other courses that appropriately assess a variety of skills and knowledge; bar exam passage rates; placement rates; surveys of attorneys, judges, and alumni; and assessment of student performance by judges, attorneys, or law professors from other schools. The methods used to measure the degree of student achievement of learning outcomes are likely to differ from school to school and law schools are not required by this standard to use any particular methods.

CHAPTER 5: ADMISSIONS AND STUDENT SERVICES

Standard 503. ADMISSION TEST

Interpretation 503-3

(a) It is not a violation of this Standard for a law school to admit no more than 10% of an entering class without requiring the LSAT from:

(1) Students in an undergraduate program of the same institution as the J.D. program; and/or

(2) Students seeking the J.D. degree in combination with a degree in a different discipline.

(b) Applicants admitted under subsection (a) must meet the following conditions:

(1) Scored at or above the 85th percentile on the ACT or SAT for purposes of subsection (a)(1), or for purposes of subsection (a)(2), scored at or above the 85th percentile on the GRE or GMAT; and



(2) Ranked in the top 10% of their undergraduate class through six semesters of academic work, or achieved a cumulative GPA of 3.5 or above through six semesters of academic work.

Standard 504. QUALIFICATIONS FOR ADMISSION TO THE BAR

(a) A law school shall include the following statement in its application for admission and on its website:

In addition to a bar examination, there are character, fitness, and other qualifications for admission to the bar in every U.S. jurisdiction. Applicants are encouraged to determine the requirements for any jurisdiction in which they intend to seek admission by contacting the jurisdiction. Addresses for all relevant agencies are available through the National Conference of Bar Examiners.

(b) The law school shall, as soon after matriculation as is practicable, take additional steps to apprise entering students of the importance of determining the applicable character, fitness, and other requirements for admission to the bar in each jurisdiction in which they intend to seek admission to the bar.

Standard 505. GRANTING OF J.D. DEGREE CREDIT FOR PRIOR LAW STUDY

(a) A law school may admit a student and grant credit for courses completed at another law school approved by the Council if the courses were undertaken as a J.D. degree student.

(b) A law school may admit a student and grant credit for courses completed at a law school in the United States that is not approved by the Council if graduates of the law school are permitted to sit for the bar examination in the jurisdiction in which the school is located, provided that:

(1) the courses were undertaken as a J.D. degree student; and

(2) the law school would have granted credit toward satisfaction of J.D. degree requirements if earned at the admitting school.

(c) A law school may admit a student and grant credit for courses completed at a law school outside the United States if the admitting law school would have granted credit towards satisfaction of J.D. degree requirements if earned at the admitting school.

(d) A law school may grant credit toward a J.D. degree to a graduate of a law school in a country outside the United States for credit hours earned in an LL.M. or other post-J.D. program it offers if:

(1) that study led to successful completion of a J.D. degree course or courses while the student was enrolled in a post-J.D. degree law program; and

(2) the law school has a grading system for LL.M. students in J.D. courses that is comparable to the grading system for J.D. degree students in the course.

(e) A law school that grants credit as provided in Standard 505(a) through (d) may award a J.D. degree to a student who successfully completes a course of study that satisfies the requirements of Standard 311 and that meets all of the school’s requirements for the awarding of the J.D. degree.

(f) Credit hours granted pursuant to subsection (b) through (d) shall not, individually or in combination, exceed one-third of the total required by the admitting school for its J.D. degree.

Standard 509. REQUIRED DISCLOSURES

(a) All information that a law school reports, publicizes, or distributes shall be complete, accurate and not misleading to a reasonable law school student or applicant. A law school shall use due diligence in obtaining and verifying such information. Violations of these obligations may result in sanctions under Rule 16 of the Rules of Procedure for Approval of Law Schools.

(b) A law school shall publicly disclose on its website, in the form and manner and for the time frame designated by the Council, the following information:

(1) admissions data;

(2) tuition and fees, living costs, and financial aid;

(3) conditional scholarships;

(4) enrollment data, including academic, transfer, and other attrition;

(5) numbers of full-time and part-time faculty, professional librarians, and administrators;

(6) class sizes for first-year and upper-class courses; number of seminar, clinical and co-curricular offerings;

(7) employment outcomes; and

(8) bar passage data.

(c) A law school shall publicly disclose on its website, in a readable and comprehensive manner, the following information on a current basis:

(1) refund policies;

(2) curricular offerings, academic calendar, and academic requirements; and

(3) policies regarding the transfer of credit earned at another institution of higher education. The law school’s transfer of credit policies must include, at a minimum:

(i) A statement of the criteria established by the law school regarding the transfer of credit earned at another institution; and

(ii) A list of institutions, if any, with which the law school has established an articulation agreement.

(d) A law school shall distribute the data required under Standard 509(b)(3) to all applicants being offered conditional scholarships at the time the scholarship offer is extended.

(e) If a law school makes a public disclosure of its status as a law school approved by the Council, it shall do so accurately and shall include the name and contact information of the Council.

Source: American Bar Association Section of Legal Education and Admissions to the Bar, Standards and Rules of Procedure for Approval of Law Schools (2014–2015), available at http://www.americanbar.org/groups/legal_education/resources/standards.html.


addition, the revisions address many of the changes that have occurred in legal education since the last Comprehensive Review. Finally, the revisions respond to changes and requirements in the U.S. Department of Education regulations, streamline the sabbatical review process, strengthen curricular requirements, and strengthen the reporting requirements for consumer information.

It should be noted that the Comprehensive Review occurred during a period of dramatic change in the legal profession and legal education—a transition from high enrollments and bountiful employment opportunities to reduced enrollments and a contraction of the job market. The Standards, substantially improved against the backdrop of these stresses and strains on the legal profession and legal education, are destined to strengthen the quality of American legal education as we go forward.

Notes

1. Editor’s Note: For a summary of the Standards review process, the goals of accreditation, and critical issues encompassed in the current Comprehensive Review, see Donald J. Polden, Comprehensive Review of American Bar Association Law School Accreditation Policies and Procedures: A Summary, 79(1) The Bar Examiner 42–49 (February 2010).

2. The Report of the Accreditation Policy Task Force is available at http://www.americanbar.org/content/dam/aba/migrated/legaled/actaskforce/2007_05_29_report_accreditation_task_force.authcheckdam.pdf.

3. Information about the charges of these three committees, as well as their final reports, is available on the Special Committees Report web page of the ABA Section of Legal Education and Admissions to the Bar, http://www.americanbar.org/groups/legal_education/committees/standards_review/comp_review_archive/special_committee_reports.html (last visited Feb. 13, 2014).

4. For details of the transition and implementation plan, see Transition to and Implementation of the New Standards and Rules of Procedure for Approval of Law Schools, August 13, 2014, available at http://www.americanbar.org/content/dam/aba/administrative/legal_education_and_admissions_to_the_bar/governancedocuments/2014_august_transition_and_implementation_of_new_aba_standards_and_rules.authcheckdam.pdf.

5. The 2014–2015 Standards and Rules of Procedure for Approval of Law Schools are available on the Standards web page of the ABA Section of Legal Education and Admissions to the Bar, http://www.americanbar.org/groups/legal_education/resources/standards.html. This web page also includes an overview and detailed explanation of the changes to the Standards, as well as a redline version of the revised Standards.

6. The major substantive revisions to Standard 509 went into effect in August 2013.

7. For a summary of the proposals to modify Standard 405, see American Bar Association Section of Legal Education and Admissions to the Bar, Explanation of Changes, available at http://www.americanbar.org/content/dam/aba/administrative/legal_education_and_admissions_to_the_bar/council_reports_and_resolutions/201408_explanation_changes.authcheckdam.pdf.

Jeffrey E. Lewis is Dean Emeritus and Professor of Law at the Saint Louis University School of Law, where he served as dean from 1999 to 2010. He served on the law faculty at the University of Florida College of Law from 1972 to 1999, and during his tenure at the University of Florida he served as associate dean for seven years and dean for eight years. Dean Lewis served as chair of the American Bar Association Standards Review Committee from August 2011 to August 2014. He has served on the Council of the ABA Section of Legal Education and Admissions to the Bar; he has also chaired the ABA Accreditation Committee, served on the Accreditation Committee of the Association of American Law Schools, and chaired or served as a member of over 20 ABA/AALS site evaluation teams.


The Testing Column
Essay Grading Fundamentals
by Judith A. Gundersen

As I write this column, bar exam graders across the country are in some stage of grading essays and performance tests. Every U.S. jurisdiction is responsible for grading the written component of its bar examination—whether the written component consists of the Multistate Essay Examination (MEE), the Multistate Performance Test (MPT), jurisdiction-drafted questions, or some combination of two or all three. Grading the written portion of the bar examination is a painstaking process that accounts for at least half of an examinee’s grade—thus a significant component of the overall bar exam score. This column focuses on some essay (and performance test) grading fundamentals: rank-ordering, calibration, and taking into account an examinee’s ability to communicate in writing. Adhering to these fundamentals helps ensure fair and reliable essay grading procedures and score results.

First, a few words are in order about the role that equating plays in the overall context of grading. As stated many times in this column and elsewhere in the Bar Examiner, the purpose of the bar examination is to determine minimal competence to be licensed as an attorney. Both fairness to examinees and protection of the public dictate that the bar exam be reliable and valid across test forms and administrations. The Multistate Bar Examination (MBE) is the only part of the bar exam that is equated across all administrations. This is done by embedding a mini test form within the MBE with known statistical properties that is then compared between the control group and current test takers. This equating process ensures comparable score meaning across MBE administrations.

But what about equating the MEE and the MPT? These tests cannot be equated in the same sense that the MBE is equated because their questions are too memorable to be reused or embedded in an exam—examinees spend 30 minutes on a given MEE question and 90 minutes on a given MPT question, as opposed to just a few minutes on an MBE question. Any examinee who had seen an MEE or MPT question before would remember it and have an advantage over an examinee who had never seen the question. (Once an MEE or MPT is administered, none of its questions is ever used again on another test form. Retired questions are made available for purchase or free of charge on our website as study aids or for use in law schools.)

Because MEEs and MPTs cannot be equated in the same way as the MBE, but are a critical piece of the bar exam score, NCBE recommends the best practice of scaling the written scores to the MBE: raw scores earned on each MEE and MPT question are added up and then scaled to the MBE. This in effect puts the overall score earned on the written portion of the exam on the MBE scaled score distribution, thereby using the equating power of the MBE to give comparability to the written portion. Scaling preserves the important rank-ordering judgments that graders have made on answers.¹
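For readers who want to see the arithmetic, the short Python sketch below illustrates one simple form of linear scaling: the written raw scores are mapped onto the MBE scaled-score distribution by matching means and standard deviations. The scores are invented, and the snippet illustrates the general idea only; it is not NCBE’s operational scaling procedure.

```python
# Minimal sketch of linear scaling: written raw scores are placed on the
# MBE scaled-score distribution by matching mean and standard deviation.
# Hypothetical data; not NCBE's actual scaling procedure.
from statistics import mean, pstdev

# Each examinee's summed raw score on the written portion (MEE + MPT).
written_raw = [38.0, 45.5, 52.0, 41.0, 60.5, 33.0]

# The same examinees' MBE scaled scores for this administration.
mbe_scaled = [128.3, 141.7, 150.2, 136.9, 158.4, 121.5]

raw_mean, raw_sd = mean(written_raw), pstdev(written_raw)
mbe_mean, mbe_sd = mean(mbe_scaled), pstdev(mbe_scaled)

def scale_written(raw_score: float) -> float:
    """Map a written raw score onto the MBE scaled-score distribution."""
    return mbe_mean + (raw_score - raw_mean) / raw_sd * mbe_sd

scaled_written = [round(scale_written(s), 1) for s in written_raw]
print(scaled_written)  # same rank order as written_raw, now on the MBE scale
```

Because the transformation is linear, an examinee who ranks third on written raw score still ranks third after scaling; the scale changes, but the graders’ rank-ordering judgments carry through untouched.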


Rank-Ordering Papers

MEE and MPT questions are developed to test content and skills set forth in the MEE subject matter outline and the MPT list of skills tested. Within each MEE and MPT, multiple issues are raised that might be addressed by examinees—some issues are easier to identify and some are subtler. Multiple issues help graders make meaningful grading distinctions among papers. Some papers should get high scores, some average scores, and some lower scores, regardless of what score scale a jurisdiction uses (1–5, 1–6, 1–10, etc.), and regardless of whether, taken as a whole, all papers are strong or weak. What matters is rank-ordering among papers—relative grading.

Rank-ordering works best if distinctions are made between papers and scores are spread out over the whole score scale (whatever that may be). For example, if a jurisdiction uses a 1–6 scale (a “1” paper being a very poor answer relative to the other answers in the jurisdiction, and a “6” paper being an excellent answer relative to the other answers in the jurisdiction), it is important that graders assign 1’s, 2’s, 3’s, 4’s, 5’s, and 6’s, not just compress all of their grades between 3’s and 4’s. Were a grader to give every answer in her group of papers a “3,” for example, the question would, in effect, be thrown out—it would have no impact on examinees’ scores. It would be like keying all answers correct in a multiple-choice question. Similarly, but to a lesser degree, bunching all grades between just two of the points on a 6-point scale would diminish the relative value that this particular question would have on an examinee’s overall written score.
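A small numerical illustration (hypothetical grades, not actual data) makes the point: a question on which every paper receives the same grade has no spread, so it adds a constant to every total and cannot change anyone’s relative standing.

```python
# Hypothetical illustration: identical grades on a question contribute no
# spread, so that question cannot affect the rank order of total scores.
from statistics import pstdev

bunched_grades = [3, 3, 3, 3, 3, 3]   # grader used only one point of the scale
spread_grades  = [1, 2, 3, 4, 5, 6]   # grader used the whole 1-6 scale

print(pstdev(bunched_grades))  # 0.0 -> the question is effectively thrown out
print(pstdev(spread_grades))   # ~1.7 -> the question helps distinguish papers
```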

To prepare graders, NCBE provides detailed grading materials, which are subjected to review by outside content experts, editing by drafting committees, and proofing and cite-checking by NCBE lawyer-editors. User jurisdictions also have the option of reviewing the questions and grading materials before administration. NCBE hosts an MEE/MPT grading workshop after each administration, with three participation options for graders: in person, by conference call, or via on-demand streaming. Finally, the grading materials are included in MEE and MPT study aids, so prospective examinees can become familiar with the questions and what graders are looking for in examinee answers.

Rank-ordering papers is harder when a grader perceives that the answers are all very good or all very poor. But meaningful distinctions between papers can and should be made no matter whether a paper evidences a weak or strong performance. That is, a grader should take into account an examinee’s use of the facts, the quality and depth of the examinee’s legal analysis, the examinee’s issue-spotting ability, and the quality of the examinee’s writing (more on this later). Considering each paper as a whole, informed by the grading materials, rank-ordering papers using the entire score scale will best ensure that examinees’ written scores reflect their performance on this portion of the exam.

Achieving and Maintaining Grading Consistency: Calibration

Whether a grader grades all the answers to a certain question himself or with other graders, getting and staying calibrated is critical. Calibration is the process by which a grader or group of graders develops coherent and identifiable grading judgments so that the rank-ordering is consistent throughout the grading process and across multiple graders. It shouldn’t matter to an examinee if her answer is paper number 1 for grader A or paper number 233 for grader B.

To calibrate, graders begin by reading a set of 10 or more common papers and assigning tentative grades. Multiple graders compare their grades on the sample group and see where they need to resolve grading judgments. Once any differences between grading judgments are worked out, then another sample group of 10 papers should be read to see if the graders are in alignment. Again, grading differences on this second set of sample papers must be resolved. Finally, a third set of 10 common papers might be necessary to ensure that graders are grading consistently. If the total number of examinees or papers to be graded in an administration reaches the hundreds or thousands, it might be a good idea to embed a few common papers among multiple graders, those papers then being checked to ensure that consistency is maintained over the course of the grading process.
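As a hypothetical sketch of that kind of consistency check (the papers, graders, and scores below are invented), a jurisdiction could flag any embedded common paper on which graders diverge by more than a set tolerance:

```python
# Hypothetical calibration check: each grader has scored the same embedded
# common papers; flag any paper where their grades diverge noticeably.
common_paper_grades = {
    "paper_01": {"grader_A": 4, "grader_B": 4},
    "paper_02": {"grader_A": 2, "grader_B": 3},
    "paper_03": {"grader_A": 5, "grader_B": 3},  # likely needs discussion
}

TOLERANCE = 1  # acceptable difference on a 1-6 scale

for paper, grades in common_paper_grades.items():
    scores = list(grades.values())
    if max(scores) - min(scores) > TOLERANCE:
        print(f"{paper}: grades {grades} differ by more than {TOLERANCE}; "
              "graders should confer and re-align before continuing")
```

Flagged papers would then be discussed and re-graded until the graders’ judgments realign, which is the whole point of staying calibrated.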

Single graders should also start with a defined set of papers to gauge what the pool of answers will look like and assign tentative grades until they’ve seen more papers. Because grading is relative and papers are to be rank-ordered, context is everything. Early grades will probably need rechecking as more answers are read. Some graders find it helpful to keep benchmark papers—representative papers for each point on the score scale—to help re-orient themselves after a grading break. It may also be helpful for a grader or graders to try to put papers in buckets or piles representing each point on the score scale to ensure that they are, in fact, using the whole score scale and not bunching all answers between two points on their score scale.

Taking into Account Examinees’ Ability to Communicate in Writing

One way for graders to make distinctions between papers is to take into consideration examinees’ ability to communicate in writing—this is a construct of the MEE and MPT and is set forth in the purpose statement of the MEE and the skills tested in the MPT. A lawyer’s ability to communicate in writing is a critical lawyering skill. NCBE’s 2012 job analysis confirmed this—100% of all respondents to the survey we distributed to new lawyers stated that the ability to communicate in writing was “extremely significant” to their jobs as lawyers.² If writing didn’t matter, then the bar exam could consist solely of multiple-choice questions—which would save a lot of time and effort. But it does matter.

Demonstrating the ability to communicate in writing does not mean using legalese or jargon. Rather, it means writing a well-organized paper that demonstrates an understanding of the law and how to apply it to the facts in the problem. It means, as stated in the MEE instructions, “show[ing] . . . the reasoning by which you arrive at your conclusions.”

The MPT has more specific criteria for assessing the quality of an examinee’s writing than the MEE, as MPT examinees are instructed on the proper tone for the assignment (e.g., persuasive, objective), the proper audience (e.g., court, client, opposing counsel), and sometimes the desired formatting (e.g., the use of headings, statement of facts, case citations). Thus, in general, it can be easier for graders to make distinctions on the quality of writing when grading MPTs. However, graders can make a meaningful assessment of writing ability on both the MPT and the MEE.

Conclusion

Graders have an important job, and they know it. I’ve met hundreds of graders over the years, and they all take their jobs very seriously and strive to make consistent and fair decisions. Employing the practices and principles of rank-ordering, achieving and maintaining calibration, and assessing written communication ensures a fair and reliable process for grading the all-important written portion of the bar examination.

Notes

1. For a detailed explanation about scaling, see the December 2014 Testing Column: Mark A. Albanese, Ph.D., The Testing Column: Scaling: It’s Not Just for Fish or Mountains, 83(4) The Bar Examiner 50–56 (December 2014).

2. The NCBE job analysis is part of a content validity study conducted by NCBE in conjunction with its testing program. The job analysis was carried out through a survey distributed to a diverse group of lawyers from across the country who had been in practice from one to three years. Its goal was to determine what new lawyers do, and what knowledge, skills, and abilities newly licensed lawyers believe that they need to carry out their work. The job analysis, entitled A Study of the Newly Licensed Lawyer, is available on the NCBE website at http://www.ncbex.org/publications/ncbe-job-analysis/.

Judith A. Gundersen is the Director of Test Operations for the National Conference of Bar Examiners.


Litigation Update
by Fred P. Parker III and Jessica Glad

Cases Reported

ABA-Approved Law Schools

Attorney from non-ABA-approved law school

In the Matter of Odegua J. Irivbogbe, Applicant to the West Virginia Board of Law Examiners, 2014 WL 2404312 (WV)

Attorney Discipline

Unauthorized practice of law

In re Seth Cortigene and Newton B. Schwartz, Sr., 144 So. 3d 915 (LA 2014)

Character and Fitness

Financial irresponsibility

In the Matter of the Application of T. Z.-A. O. for Admission to the Bar of Maryland, 441 Md. 65, 105 A.3d 492 (MD 2014)

ABA-Approved Law Schools

Attorney from non-ABA-approved law school

In the Matter of Odegua J. Irivbogbe, Applicant to the West Virginia Board of Law Examiners, 2014 WL 2404312 (WV)

Odegua Irivbogbe graduated from the University of Lagos in Nigeria, passed the New York bar exam in 2007, and was admitted to the New York Bar in 2008. She never practiced in New York. She moved to West Virginia and filed an application in July 2012 with the West Virginia Board seeking admission by examination under the Board’s Rule 3.0. The Board denied the application based on Irivbogbe’s failure to meet the educational requirements of Rules 2.0 and 3.0. The Board found that Rule 3.0(b)(4) requires that a graduate of a foreign law school where the common law of England exists as the basis for its jurisprudence must successfully complete 30 basic credit hours at an ABA-approved law school in order to sit for the West Virginia bar exam, and Irivbogbe had not completed these credits.


Irivbogbe then requested an administrative hearing, which was held in February 2013. The hearing examiner concluded that the Board’s decision must be affirmed. The Board reviewed this report in June 2013 and voted to deny Irivbogbe’s application based on her failure to meet the educational requirements in Rules 2.0 and 3.0 because the Rules do not allow the Board any discretion to waive or modify these requirements.

The matter was appealed to the West Virginia Supreme Court. The Court reviewed the record de novo with regard to questions of law, the application of the law to the facts, and whether the applicant should or should not be admitted to the practice of law in West Virginia.

Irivbogbe argued that her legal education and the study materials she had used in preparing for the New York bar exam were equivalent to an education received at an ABA-approved law school and that she should be allowed to sit for the West Virginia bar exam. She further asserted that the Board had misapplied Rule 3.0(b)(4) and that she should be allowed to sit pursuant to Rule 3.0(b)(1), which applies to graduates of non-ABA-approved law schools who have passed the bar exam in another state and have been admitted in that state. This argument had been rejected by the Board because that rule applies only to graduates of U.S. law schools.

The Court stated that the main issue on appeal was whether the Board had correctly concluded that Rule 3.0(b)(4) applies to all foreign law school graduates. The Court agreed with the Board that all foreign-educated applicants must successfully complete a minimum of 30 credit hours of basic courses selected from certain listed areas of law. Since Irivbogbe had not done this, she was not currently eligible for admission to practice in West Virginia by examination.

Irivbogbe also argued that a denial would violate her right to equal protection. The Board had found that Irivbogbe was not similarly situated to applicants who were educated at ABA-approved law schools. The Court agreed and affirmed the Board’s decision.

Attorney Discipline

Unauthorized practice of law

In re Seth Cortigene and Newton B. Schwartz, Sr., 144 So. 3d 915 (LA 2014)

In a matter of first impression, the Supreme Court of Louisiana considered whether it has the authority to impose discipline on a lawyer not admitted to the bar of Louisiana.

This case arose from consolidated disciplinary proceedings resulting from formal charges filed by the Office of Disciplinary Counsel (“ODC”) against attorneys Seth Cortigene and Newton B. Schwartz, Sr. Cortigene was licensed to practice law in Texas and Louisiana but was currently ineligible to practice due to his failure to comply with his professional obligations. Schwartz was licensed to practice law in Texas and Pennsylvania, but was not licensed in Louisiana. The ODC charged Schwartz with engaging in the unauthorized practice of law by appearing at and participating in a Louisiana deposition. The ODC charged Cortigene with facilitating Schwartz’s misconduct and failing to report it to disciplinary authorities. The matter proceeded to a hearing on the formal charges filed against both attorneys.


The hearing committee concluded that both attorneys had violated the Rules of Professional Conduct as charged. It recommended that Cortigene be disbarred. However, because Schwartz was not admitted to the Louisiana bar, the hearing committee declined to recommend disbarment for him, and instead recommended that Schwartz be publicly reprimanded and permanently enjoined from the practice of law in the state.

The disciplinary board adopted the hearing committee’s factual findings, its legal conclusions, and most of its sanctions recommendations. It agreed that disbarment would be the appropriate sanction for Schwartz’s misconduct, but reasoned that disbarment was inapplicable to an attorney not admitted to the Louisiana bar. Instead, the board concluded that the only sanction applicable to a non-Louisiana attorney would be a public reprimand. The ODC appealed to the Supreme Court of Louisiana, asserting that the board had erred in concluding that Schwartz could not be disbarred in Louisiana.

The Court conducted an independent review of the record to determine whether the alleged misconduct was proven by clear and convincing evidence. Based on its review, the Court concluded that Schwartz had engaged in the unauthorized practice of law. It also found that Cortigene had facilitated Schwartz’s unauthorized practice and failed to report it to disciplinary authorities. The Court disbarred Cortigene, but concluded that the appropriate sanction for Schwartz’s conduct if he were a member of the state bar would be a three-year suspension.

The Court then answered in the affirmative what it considered to be a res nova issue, that is, whether it has the authority to impose discipline on a lawyer not admitted to the bar of Louisiana. Pursuant to its plenary power to define and regulate all facets of the practice of law, the Court explained, “we have the right to fashion and impose any sanction which we find is necessary and appropriate to regulate the practice of law and protect the citizens of this state. This power is broad enough to encompass persons not admitted to the bar who attempt to practice law in this state.” Therefore, the Court concluded, “in the exercise of our plenary authority, we may enjoin a non-Louisiana lawyer from seeking the benefits of a full or limited admission to practice in this state.”

Accordingly, the Court adjudged Schwartz guilty of conduct that would warrant a three-year suspension if he were a member of the Louisiana bar. However, recognizing that he was not a member of the state bar, the Court enjoined Schwartz for a period of three years from seeking admission to practice law in Louisiana on either a permanent or temporary basis.

Character and Fitness

Financial irresponsibility

In the Matter of the Application of T. Z.-A. O. for Admission to the Bar of Maryland, 441 Md. 65, 105 A.3d 492 (MD 2014)

In May 2012, T. Z.-A. O. (“Applicant”) filed an application for admission to the Maryland Bar with the State Board of Law Examiners, and in June 2012 the Board sent the application to the Character Committee for the Fifth Appellate Circuit (“the Committee”). Applicant passed the July 2012 Maryland bar exam; however, based on the results of the Committee’s investigation, in June 2013 a three-member panel of the Committee held a hearing to determine whether Applicant possessed the good moral character and fitness necessary for admission to the Maryland Bar. The panel found the following facts:

• In May 1996, Applicant was arrested for public indecency in Columbus, Ohio. He pled guilty in July 1996 and was sentenced to 30 days in jail with 1 day credited toward the sentence and the other 29 days suspended.

• In 2004 Applicant applied to and was accepted at Tulane Law School. Question 28 on the law school application asked about criminal charges and convictions. Applicant answered in the negative and certified that his answers were true.

• Applicant disclosed the 1996 arrest and conviction when he applied to the Florida Bar, and the Florida Board discovered that Applicant had failed to disclose the 1996 matters to Tulane and contacted Applicant, who then notified Tulane about his failure to disclose.

• In May 2004, shortly after applying to law school, Applicant filed a petition for Chapter 7 bankruptcy and at the hearing admitted that his financial activities had been irresponsible and included the use of multiple credit cards when he was unemployed with no means to pay the balances. In August 2004 $58,000 in debt was discharged.

• In August 2006, Applicant purchased a new Honda vehicle. Applicant stated that he had been planning to purchase a used car, but that the sales personnel persuaded him to test-drive new vehicles and prepared the car loan application as well as additional contracts and agreements related to the car purchase. The loan application did not mention the 2004 bankruptcy and falsely stated that Applicant owned a home, made no rental or mortgage payments, and earned $3,500 a month. Applicant certified that all the information on the application was true, correct, and complete. Applicant later claimed that the sales representative must have inserted the false information about home ownership and income into the car loan application. Applicant knew that the interest rate was 14.95% and that his monthly payments would be $674.70 when he took possession of the vehicle.

• In fall 2007, Applicant stopped making monthly car payments because he claimed that the contract contained irregularities; however, he continued using the car and did not turn it in until February 2008. At that time there was a $19,000 arrearage on the car loan. Applicant testified that he litigated against the finance company and the company forgave the outstanding arrearage.

• At the time of the hearing, Applicant was self-employed and performed research and writing for a law firm in Florida. In 2012 Applicant had earned $24,000, and from January to June 2013 he had earned between $18,000 and $19,000. Applicant admitted that he had $220,000 in private and federal student loan debt, that he was making only minimum payments on the private loans, and that the federal loans were in forbearance or deferred.

The Committee concluded that Applicant had not shown financial responsibility. He had continued to accumulate debt with no plans to repay it. Since his bankruptcy, in which he had discharged $58,000 in debt, he had accumulated nearly $200,000 in student and consumer loans and had discharged the $19,000 vehicle loan. He also had not acknowledged the untruthful information on his loan application, and he had stated that his application to the Florida Bar was denied because of illegal behavior and financial irresponsibility. The Committee recommended that Applicant’s application be denied.

In December 2013, the Board conducted a hearing and at its conclusion recommended that Applicant be denied admission to the Maryland Bar, finding that after his bankruptcy he had established numerous consumer credit accounts and that he had not taken his credit obligations seriously until it appeared that they might keep him from being admitted in Maryland. Applicant was 31 years old and a college graduate when he entered law school, was 32 and a law student when he signed the car loan papers, was 33 when he applied to be admitted in Florida, and was 38 when his Maryland application was accepted. He seemed to treat these incidents of financial irresponsibility as youthful indiscretions. The Board found that “[h]e [had] shown no commitment to honesty and financial responsibility.”

The matter was then reconsidered by the Court of Appeals of Maryland, which under the Court’s rules is “charged with the responsibility to conduct an independent evaluation of the applicant’s moral character based upon testimony and evidence submitted before the Committee and the Board.” The Court noted that in bar admission cases failure to honor financial obligations and lack of candor are serious matters that do not reflect well upon an applicant’s fitness to practice law, adding that absolute candor is a requisite of admission in Maryland.

The Court again concluded that Applicant’s “inability to honor financial obligations and to be financially responsible, as well as (his) lack of candor, reflect that he does not presently possess the moral character and fitness necessary to practice law” in Maryland. The Court stated that Applicant’s character and fitness to practice were shown to be questionable by his lack of candor in connection with the 2006 car loan application containing false financial information. The Court added that rather than accepting full responsibility for this, he blamed the car sales representative. Stating that both the Committee and the Board had recommended denial of admission, the Court again agreed, concluded that Applicant had failed to meet the burden of proof, and denied his application for admission to the Maryland Bar. The Court also denied Applicant’s Motion for Reconsideration of the Court’s April 2014 denial of Applicant’s application.

Fred P. Parker III is Executive Director Emeritus of the Board of Law Examiners of the State of North Carolina.

Jessica Glad is Staff Attorney for the National Conference of Bar Examiners.