TRANSCRIPT
LP Modelling
36 HYDROCARBON ASIA, MAY/JUNE 2004 Visit our website at: http://www.safan.com
The New LP

New technology is changing the face of Linear Programming. The New LP is non-linear without compromise, reliably convergent, predictable, and no longer an isolated application but an integral part of the business process. True cut-point optimisation is a convenient reality. With modern software tools, the New LP is easier to learn and the results applicable to a wider audience. It is available to and used by more people within the refinery. In this paper, the virtues of the 21st century LP application will be described, and its benefits discussed.
Historical Perspective

To appreciate the significant technical advances now available in 21st century LP systems, it is important to understand how refinery LP technology has evolved. In 1947, George Dantzig, later a professor at Stanford University, developed the first linear programming algorithm. It was used on computers of that era to help solve military supply and planning problems in the years following the Second World War. Since then, LP technology has been used in many different industries to solve myriad problems. Its use in the petroleum refining arena, however, is arguably responsible for the greatest advancements in this extraordinary technology.
Early LPs were very difficult to create and interpret. Their input "decks" were literally boxes of punch cards, each card specifying a particular matrix element. LP execution times typically ran into hours. In the early 1960s, Larry Haverly developed a concept that revolutionised LP modelling. Shortly thereafter, John Bonner, Joe Moore, and others followed with similar approaches. Haverly developed a computer language, appropriately named MaGen, which allowed for the automatic generation of an LP matrix from a logical array of recognisable data tables. MaGen was perhaps the first application of database technology in the computer industry.

A problem recognised in the infancy of LP was that LP models were indeed linear, while the operations they were meant to represent were often non-linear. Nowhere was this more evident than in an LP situation commonly referred to as "The Pooling Problem". Since it was not possible to specify the quality of unfinished streams resulting from the joining or "pooling" of two or more other unfinished streams, models were developed to allow all unfinished streams to blend directly to every finished product for which they were a potential component. This led to a situation called "over-optimisation", where LP results were often too optimistic and entirely unrealistic.
In the late 1970s, the LP community overcame the Pooling Problem by developing advanced recursive techniques. Programmers could now construct models with the qualities of these intermediate pools specified as variables, to be updated through successive solutions of the LP model in a manner that drives the problem to convergence. As a result of this advancement, LP models could finally be constructed to represent the realistic behaviour of refinery operations. Even so, LP models required exorbitant amounts of computer resources and long execution times.
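The recursion idea can be sketched in a few lines. The instance below is a toy, not Haverly's production algorithm: all stream qualities, costs, prices, and specs are invented, and a closed-form allocation stands in for the LP solve. The loop structure, however, is the one described above: fix the pool quality, solve the now-linear allocation step, recompute the quality from the chosen feeds, and repeat with damping until the quality stops changing.

```python
# Toy sketch of pooled-quality recursion; all numbers are invented.
POOL_FEEDS = {"A": (3.0, 6.0), "B": (1.0, 16.0)}  # sulfur wt%, cost $/bbl
DIRECT_SULFUR, DIRECT_COST = 2.0, 10.0            # stream that bypasses the pool
PRICE, SPEC, DEMAND = 12.0, 2.0, 100.0            # product price, max sulfur, bbl

def allocate(q):
    """Linear step: with pool sulfur *fixed* at q, fill demand from the pool
    (fed by its cheapest component) or from the direct stream."""
    cheapest = min(cost for _, cost in POOL_FEEDS.values())
    if q <= SPEC and (PRICE - cheapest) >= (PRICE - DIRECT_COST):
        return DEMAND, 0.0                        # (pool bbl, direct bbl)
    return 0.0, DEMAND

def recurse(q=1.5, damping=0.5, tol=1e-6, max_iter=50):
    """Update the pool quality from each solution and re-solve to convergence."""
    for _ in range(max_iter):
        pool_vol, direct_vol = allocate(q)
        # the cheapest feed (A) fills the pool, so its sulfur is the new
        # quality estimate; an empty pool leaves the estimate unchanged
        q_new = POOL_FEEDS["A"][0] if pool_vol > 0 else q
        if abs(q_new - q) < tol:
            break
        q += damping * (q_new - q)
    cheapest = min(cost for _, cost in POOL_FEEDS.values())
    profit = pool_vol * (PRICE - cheapest) + direct_vol * (PRICE - DIRECT_COST)
    return q, pool_vol, direct_vol, profit

q, pool_vol, direct_vol, profit = recurse()
```

Starting from an optimistic quality guess, the pooled stream first looks attractive, but once the cheap high-sulfur feed raises the recursed quality above the spec, the solution settles on the direct stream and the recursion converges. Production systems re-solve a full LP with distribution and error vectors at each pass, but the convergence mechanism is the same in spirit.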
With the development of powerful personal computers in the 1980s, LP technology truly became the planning tool of choice throughout the refining industry. No longer was linear programming restricted to companies with access to ample mainframe resources. Convenient PC spreadsheet technology became the preferred environment for the maintenance and storage of LP model data, as well as the recipient and reporter of solution results. The PC environment also opened the door to the development of friendly user interfaces, which made LPs much easier to construct, maintain, execute, interpret, and manage.
Throughout the 1980s, researchers continued to develop techniques that would allow LP models to better meet the nonlinear demands of real refining operations. With the introduction of the US Reformulated Gasoline (RFG) regulations in the early 1990s, the desire to develop such new techniques became a requirement.
In 1994, Dean Trierwiler adopted a technique earlier developed by BP's Dr. Roger Main, which allowed for the determination of pooled stream properties that were themselves functions of other stream properties. With this technique, which Trierwiler called "Adherent Recursion", the complex RFG emission properties could be modelled as nonlinear functions of linearly determined gasoline properties.

While LP technology was advancing, process and distillation simulation technology also progressed. Such simulators were being used with increasing frequency to prepare LP model data, but while the simulators were able to consider the real, nonlinear thermodynamic relationships involved in the processing of hydrocarbons, only linearized approximations could be passed on to an LP. The LP modelling community sought to better integrate these technologies and further improve the accuracy and application of their models.
In the late 1990s, the Adherent Recursion technique was further refined and applied to process unit modelling (which was Dr. Roger Main's originally intended use for the technique). Links were developed to embed simulators into the recursive life of the LP. These developments have led to the direct LP optimisation of such nonlinear operating variables as unit severities and distillation cut-points.
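What "embedding a simulator in the recursive life of the LP" can look like is sketched below with a one-variable successive-linearization loop: each pass linearizes a nonlinear yield curve at the current operating point, lets the linear step move within a trust region, and re-linearizes at the new point. The yield curve, bounds, and starting point are invented stand-ins, not data from any real unit.

```python
def simulator_yield(t):
    """Stand-in for a simulator call: an invented smooth naphtha-yield curve
    that peaks inside the 960-1000 F riser-temperature window."""
    return 0.5875 + 0.000212 * (t - 960.0) - 9.13e-6 * (t - 960.0) * (t - 985.0)

def slp(yield_fn, lo=960.0, hi=1000.0, t=965.0, trust=10.0, tol=1e-4):
    """Successive linearization with a shrinking trust region."""
    while trust > tol:
        h = 1e-4
        slope = (yield_fn(t + h) - yield_fn(t - h)) / (2.0 * h)  # linearize
        # the linear step drives to a bound of the trust region
        cand = min(hi, t + trust) if slope >= 0.0 else max(lo, t - trust)
        if yield_fn(cand) > yield_fn(t):
            t = cand            # the "simulator" confirms the move: accept it
        else:
            trust *= 0.5        # linear prediction failed: shrink and retry
    return t

best_t = slp(simulator_yield)   # settles near the curve's true peak
```

A real implementation replaces the one variable with the full LP and the toy curve with a rigorous process simulator, but the accept-or-shrink pattern is the same: the linear model proposes, the nonlinear model disposes.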
The 21st Century LP

While the benefits of LP simulations to the refining industry were unquestioned, the LP systems in use at the end of the 1990s were still very cumbersome. Although vastly improved over systems of only a few years earlier, they still required dedicated, well-trained individuals to manage their data, control their operation, and interpret their results.
Most development efforts in recent years have focused on reducing the skill levels required to operate LPs through still further simplification of procedures, while making available tools and software integration avenues to increase personal productivity. Perhaps the most significant of these undertakings is Haverly Systems' development of its G4 Enterprise Planning System. G4 has fully embraced modern relational database technology to provide a complete LP system that is both easy to operate and technically rich.
Figure 1 depicts the typical refinery LP user's activities prior to the advent of new LP systems such as G4. Crude oil and processing data typically first had to be analysed and linearized through the use of offline simulators before they could be structured into a form acceptable to an LP. Other data, such as information related to purchases, sales, transports, and inventory, had to be restructured into LP-acceptable forms as well. The user then had to separately specify the parameters of the LP: its scope (periods and locations), blended product specifications, reporting and LP execution settings, etc. Once all parameters were set, the user could then launch a run and subsequently interrogate its results. Finally, the user was responsible for separately managing his LP cases in a manner to best provide him with all the information he needed to complete his study.

Figure 1

Figure 2
Figure 2 shows the refinery LP user's activities under the new LP systems environment. The functions previously mentioned still largely exist, but are performed away from the user's direct area of responsibility. The user interacts only with friendly interface software, which resides atop a highly sophisticated relational database. In addition, the relational database provides several more functions useful in LP modelling, such as intrinsic data checking, ready linkage to other software systems (information systems, short-term schedulers, etc), and vivid graphical utilities.
Example Case Study

The New LP allows the user to analyse refinery operations in more detail than ever before. One example is the use of a process simulator interface. Historically, process yields were represented using linear base yields and "shift" vectors. The base yields were calibrated to a specific base feed. For an FCC unit, the LP model might incorporate two or more base yield vectors, say a high-conversion yield and a low-conversion yield.
A shift vector is used to reflect yield differences when the actual feed differs from the base feed. There are many types of shift vector applications; in FCC operations, some examples of feed shift vectors might be UOP K, nitrogen, refractive index, density, hydrotreated feed, and conversion. As a simple example, say the base yields are calibrated to a feed that has a UOP K-factor of 12.0. If the LP model runs a feed with a UOP K-factor of 11.5 (and therefore less crackable), one would expect less gasoline production. The shift vector captures the product yield difference between the 12.0 base feed and the 11.5 actual feed.
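The bookkeeping behind such a shift vector is a single linear correction per quality. In this sketch the base yield and the shift coefficient are invented numbers used only to show the mechanics:

```python
BASE_K = 12.0            # UOP K-factor of the calibration feed
BASE_NAPHTHA = 0.59      # base-case naphtha volume fraction (illustrative)
SHIFT_PER_K = 0.02       # hypothetical naphtha change per unit of feed K

def shifted_naphtha(feed_k):
    """Base yield plus the UOP K shift-vector correction."""
    return BASE_NAPHTHA + SHIFT_PER_K * (feed_k - BASE_K)

# the less crackable 11.5-K feed predicts less naphtha than the 12.0-K base
delta = shifted_naphtha(11.5) - BASE_NAPHTHA
```

With the 11.5-K feed the correction is 0.02 x (11.5 - 12.0) = -0.01, i.e. one volume percent less naphtha; an LP carries one such coefficient for each product and each tracked feed quality.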
In the historical sense, both base yields and shift vectors were linear representations of process yields. In an operating refinery, process operations are not always linear, but we have historically used linear vectors to approximate the operations.
In the example below, the model has only two base yield vectors (a high and a low conversion vector) and, for simplicity, excludes quality shift vectors.

If the model runs 63% in high conversion at 1000°F riser temperature and 37% in low conversion at 960°F riser temperature, the resulting linear solution is an approximation of the FCC yields when the riser temperature is 985°F.
Although this methodology can be used, it oversimplifies the true FCC behaviour because of its linear representation. As an example, the yield of FCC naphtha is not linear. In fact, at some point along the curve, the production of naphtha actually decreases, marking the point of overcracking. The graph in Figure 3 clearly shows that this is a non-linear relationship.

The linear approximation predicts an FCC naphtha yield of 58.94 percent at a riser temperature of 985°F, which is the linear interpolation between 960°F and 1000°F. The non-linear curve was developed with an FCC simulator that has been calibrated to a generic operation. At the riser temperature of 985°F, the FCC simulator predicts a naphtha yield of 59.28 percent.
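The three published points, 58.75 vol% at 960°F, 59.28 vol% at 985°F and 59.05 vol% at 1000°F, are enough to sketch the shape of such a curve. The quadratic below is purely illustrative (the article's curve comes from a calibrated FCC simulator), but it reproduces the 58.94% linear interpolation and puts the overcracking peak a little below 985°F:

```python
T = [960.0, 985.0, 1000.0]          # riser temperatures, deg F
Y = [0.5875, 0.5928, 0.5905]        # naphtha volume fractions at those points

# Newton divided differences give the unique quadratic through the points
d1 = (Y[1] - Y[0]) / (T[1] - T[0])
d2 = ((Y[2] - Y[1]) / (T[2] - T[1]) - d1) / (T[2] - T[0])

def naphtha(t):
    return Y[0] + d1 * (t - T[0]) + d2 * (t - T[0]) * (t - T[1])

# linear interpolation between the two base points, as the LP would report it
linear_985 = Y[0] + (Y[2] - Y[0]) * (985.0 - T[0]) / (T[2] - T[0])

# stationary point of the quadratic: d1 + d2*(2t - T[0] - T[1]) = 0
t_peak = 0.5 * (T[0] + T[1] - d1 / d2)
```

On this fit the peak lands near 984°F, so 985°F already sits just past the overcracking onset, and the curve's 59.28% at 985°F exceeds the 58.94% straight line between the base points.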
Simulating FCC operations is critical to the overall success of accurately predicting refinery operations. With the new LP technology available, an FCC process simulator was developed in a spreadsheet format to link directly to the LP model. The goal of the simulator is to characterise the feed accurately and to predict the yields under various operating conditions.
                    High Conv     Low Conv      Linear         Simulator      Linear vs
Activity            Linear Yield  Linear Yield  Solution Vol%  Solution Vol%  Simulator
                    (63%)         (37%)         (100%)                        Delta

C1-C2's             0.0388        0.0265        0.0342         0.0331         -0.0011
C3's                0.1265        0.0984        0.1160         0.1154         -0.0006
C4's                0.1956        0.1671        0.1849         0.1849          0.0000
FCCU Naphtha        0.5905        0.5875        0.5894         0.5928          0.0034
FCC Lt Cycle Oil    0.1478        0.1908        0.1639         0.1632         -0.0007
FCC Slurry          0.0527        0.0684        0.0586         0.0583         -0.0003
Riser Temp, °F      1000          960           985            985             0
Conversion          80.0%         74.0%         77.8%          77.8%           0.0%

Some significant input parameters to the FCCU simulator include:

- Feed Stream Rates
- Unit Contact Times
- Unit Technology Options
- Straight Run Feed Quality Data
- Feed Injection Technology
- Hydrotreated Feed Quality Data
- Reaction System Efficiencies
- Feed Hydrotreater Reactor Data
- Catalyst Kinetic & Selectivity Factors
- Catalyst Addition Rates, Quality and Costs
- Unit Kinetic Options
- Unit Operating Conditions
- Unit Kinetic Variables
- Miscellaneous Feed Quality Data
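The "Linear Solution" column of the table above is nothing more than a volumetric blend of the two base vectors, and can be reproduced directly. The blend weight follows from the 985°F target between the 960°F and 1000°F bases (exactly 62.5%/37.5%, which the article rounds to 63%/37%):

```python
HIGH = {"C1-C2's": 0.0388, "C3's": 0.1265, "C4's": 0.1956,
        "FCCU Naphtha": 0.5905, "FCC Lt Cycle Oil": 0.1478, "FCC Slurry": 0.0527}
LOW  = {"C1-C2's": 0.0265, "C3's": 0.0984, "C4's": 0.1671,
        "FCCU Naphtha": 0.5875, "FCC Lt Cycle Oil": 0.1908, "FCC Slurry": 0.0684}

# weight on the high-conversion base implied by the 985 F target temperature
w = (985.0 - 960.0) / (1000.0 - 960.0)              # 0.625

linear = {k: w * HIGH[k] + (1.0 - w) * LOW[k] for k in HIGH}
```

Each entry reproduces the published Linear Solution column to within 0.0001, the table's own rounding.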
The FCC simulator performs detailed heat & material balance calculations and sizing calculations. This tool can be used effectively to optimise existing operations as well as to analyse capital improvement ideas.
In the FCC example above, the difference in naphtha production between the simulator and the linear representation is 0.34 percent. The question to ask is whether the linear representation is close enough. The answer is: it depends. As with most LP work, the model is geared to the level of detail sufficient to answer the question at hand. A tolerance of 0.34% naphtha yield is certainly acceptable for a regional supply/demand model; however, when considering the replacement of old FCC nozzles with advanced nozzle technology, a 0.34% difference might not be acceptable. This small tolerance could translate to millions of dollars per year, and could be the difference between accepting and rejecting a proposed project.
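The "millions of dollars" claim is easy to bound with order-of-magnitude arithmetic. Every figure below is an assumption chosen for illustration, not a number from the paper: a 50,000 bpd FCC feed rate, a $25/bbl value uplift for incremental naphtha over heavier products, and 350 on-stream days per year:

```python
FEED_BPD = 50_000          # assumed FCC feed rate, bbl/day
NAPHTHA_DELTA = 0.0034     # the 0.34 vol% yield difference discussed above
UPLIFT_PER_BBL = 25.0      # assumed naphtha value over heavier dispositions, $
ONSTREAM_DAYS = 350        # assumed operating days per year

extra_naphtha_bpd = FEED_BPD * NAPHTHA_DELTA          # about 170 bbl/day
annual_value = extra_naphtha_bpd * UPLIFT_PER_BBL * ONSTREAM_DAYS
```

Under these assumptions the 0.34% yield gap is worth roughly $1.5 million per year, which is precisely the scale at which a nozzle-revamp decision could swing either way.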
The accuracy provided by the FCC simulator coupled to the LP model is far superior to the old linear approximation. Additionally, accurate simulation of FCC operations within the context of refinery LP optimisation allows the user to understand the impact of the FCC on the overall refinery complex. For example, revamping an FCC will likely impact refinery feedstock selections and refinery production. With successive LP runs integrated with a process simulator, the ability to analyse the complex interactions between FCC operations and overall refinery economics is greatly enhanced: changing nozzle technologies, catalyst impacts, feed preheat impact, riser temperature impact, hydrotreated versus straight-run feed, recycle streams, and the list goes on.

Figure 3
The Ten Commandments of LP

In conclusion, users of the New LP must continue to recognise the abilities and limitations of LP, as well as the responsibilities they have to apply this technology properly. These responsibilities are best summed up in the following Ten Commandments of LP:
Thou shalt consider the LP model an extension of thyself.

The LP is only a tool. The responsibility for how the tool is used resides solely with its operator.
Thou shalt accept that all LP models are premise laden and assumption driven.

All LP models are built on the biases and assumptions of their creators. Users must recognise this fact, and ensure that these conditions do not interfere with, but indeed support, the use of the LP for each study.
Thou shalt accept that an LP model is only as good as the data on which it is based.

Nowhere is the old adage "Garbage In ... Garbage Out" more applicable than in LP modelling.
Thou shalt create the LP model to be an accurate representation of refinery operations.

To best optimise a refinery's operation, the LP must be given a complete, accurate description of the refinery's processing capabilities. And these capabilities must be modelled in a manner consistent with their realistic impact on operations.
Thou shalt assure that the LP model is weight balanced.

Mass can neither be created nor destroyed. Given the opportunity, an LP will always try to make something out of nothing, or destroy what it does not like.
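A minimal version of the corresponding sanity check: sum each yield vector's weight fractions and flag any closure error. The six-product vectors here are invented examples:

```python
def closure_error(weight_yields):
    """Signed deviation of a weight-yield vector from perfect mass balance."""
    return sum(weight_yields) - 1.0

balanced   = [0.03, 0.10, 0.15, 0.52, 0.14, 0.06]   # closes to 1.00
unbalanced = [0.03, 0.10, 0.15, 0.55, 0.14, 0.06]   # would "create" 3% mass

assert abs(closure_error(balanced)) < 1e-9          # passes the check
assert closure_error(unbalanced) > 0.01             # caught before the LP sees it
```

A real system applies this check per process-unit column, within the tolerance of the unit's measured loss, before any case is run.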
Thou shalt examine the LP model for local optima.

Most new LP systems have mechanisms that work to avoid local optima. Nevertheless, encountering a local optimum is always a theoretical possibility in nonlinear models. Users must be familiar enough with their models to recognise possible local optima and test for their existence.
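One practical test is a multi-start: re-solve from several starting points and compare the optima found. The objective below is an invented one-dimensional stand-in for a recursed model, with two local minima by construction:

```python
def objective(x):
    """Invented nonconvex objective with two local minima."""
    return x**4 - 4.0 * x**2 + x

def local_search(x, step=0.5, tol=1e-6):
    """Crude descent with a shrinking step; finds the nearest local minimum."""
    while step > tol:
        if objective(x + step) < objective(x):
            x += step
        elif objective(x - step) < objective(x):
            x -= step
        else:
            step *= 0.5
    return x

starts = [-2.0, 0.0, 2.0]
optima = sorted({round(local_search(s), 3) for s in starts})
```

Here the three starts collapse to two distinct optima (near x = -1.473 and x = 1.347). A model that reports different objectives from different warm starts deserves exactly this kind of scrutiny before its answer is trusted.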
Thou shalt accept the sensitivity of an LP solution.

An LP solution is based upon incremental economics, and is by nature driven to extremes. A small change in a model constraint or cost can have a substantial effect on its solution.
Thou shalt not expect that the LP will improve its data's accuracy.

It is not reasonable to force a model to solve to tolerances tighter than those upon which its data are defined.
Thou shalt accept that an LP does not consider time.

All values reported by an LP are averages over the time period(s) upon which it is defined. In all likelihood, products will never be blended to the recipes they predict, and feedstocks will never be charged in the ratios they assume.
Thou shalt never base a decision solely on an LP solution.

One should never answer a question with a statement beginning with "because the LP says ...". LPs say nothing. They are only one of the tools a planner employs to assist him in making a decision. LP solutions must be corroborated by wisdom and experience.

Hydrocarbon Asia thanks L. Dean Trierwiler of Haverly Systems Inc. and Vince B. DiVita of Jacobs Consultancy Inc. for contributing this paper, which was presented at the ARTC Computing Conference in Singapore on 20-22 October 2003.

L. Dean Trierwiler is the Business and Technical Manager at Haverly Systems, Inc., Houston, Texas. He joined Haverly in 1990, and has since worked in the management, support, and application of Haverly's planning, scheduling, and crude assay software tools. For the 16 years prior to joining Haverly, he held various planning, economic, engineering, and software development positions within the refining industry (with CITGO, Chevron, and UNOCAL). Mr Trierwiler is a recognised expert in the area of Linear Programming Recursion, and was instrumental in the development of both the Distributive and Adherent Recursion techniques. He holds a B.S. degree in Mechanical Engineering from Washington State University.

Vincent B. DiVita has 15 years' experience in the chemical and petroleum industries, with emphasis on LP refinery simulations and economic analysis. As a Group Manager of Jacobs Consultancy, his responsibilities include refinery optimisation, project management, strategic planning, and feasibility studies. He is also responsible for maintaining, developing and managing the LP model that is used to analyse a broad range of complex subjects for single-client and multi-client studies. Before joining Jacobs Consultancy, Mr. DiVita was a Senior Process Engineer with Rhône Poulenc at their Houston, Texas sulphuric acid/aluminium sulphate plant. He has also been employed as an Associate Engineer for Shell Oil Company and as a consultant with Purvin & Gertz. Mr. DiVita holds a B.S. degree in Chemical Engineering from Texas A&M University and an M.B.A. from Houston Baptist University. He is a registered Professional Engineer (Texas).