Test Systems Software / FEE Controls
Peter Chochula
PTS Status
PTS v2.0:
- Analysis and DBMS decoupled from the system (easy to upgrade now)
- System configuration via ASCII files
- Possibility to dump settings to new config files
- Loadable Maskbit and Testbit matrices
- Fully integrated bus
- Updated panels
- ... and bugs fixed
PTS Version 2.0 – Main Control Panel
- Help available (to be extended)
- DBMS integration
- Status overview
- Simplified configuration
PTS 2.0 – JTAG Integration
Supported controllers:
- Corelis MVME 1149.1, with or without an external multiplexer
- Corelis 100f (ISA)
- JTAG Technologies 3710 PCI (testbeams)
- KEJTAG v2.0
Automatic Controller Test
PTS 2.0 – Supported Testbeam Setup
[Diagram: two reference planes, the tested object, and scintillators]
PTS 2.0 – DAQ Software
3 planes, 1-10 chips per plane
Automatic Data Integrity Checks
PTS 2.0 - New Debugging Tool - Data Analyser
Plugins:
- Run conditions
- Buffered beam profile
- Data frame decoder
- Event display
- Single event processing
PTS 2.0 – A1 and BUS Manual Controls integrated with Pilot MCM (Beta)
Status of MCM JTAG Configuration
MCM Manual Control – JTAG Configuration
…Analog Pilot not yet fully integrated
PTS 2.0 – Threshold Scans – New Data Format
New, flexible data format
The ROOT interface recognizes the data format
PTS 2.0 – DAC Sweep
- Ready for any bus configuration
- Uses the MB DACs or an external device
- Integration of an external device is easy (1 VI only)
LabVIEW upgrade to v6?
If yes, then all institutes must upgrade at the same time; CERN can upgrade only as the last one.
SPD Front-end and Readout Electronics – Setup & Configuration
Based on a talk given at the ALICE TB, January 2003. Please see also the related document on the ALICE DCS web (Documents -> FERO).
ALICE online software hierarchy
[Diagram: the ECS sits on top of DCS, DAQ/RC, TRG and HLT; under the DCS, each detector (TPC, SPD, ...) has branches such as FERO, Gas, LV and HV; DAQ/RC and TRG are likewise subdivided per detector]
(Source: S. Vascotto, TB presentation, October 2002)
Partitioning of ALICE Online systems
[Diagram: the ECA coordinates partitions; within a partition (e.g. Partition A), a PCA controls its own DAQ/RC, DCS and TRG]
(Source: S. Vascotto, TB presentation, October 2002)
Example: The Design of SPD
[Diagram: Pilot MCM, sensor, readout chips and bus]
Summary: Alice FERO Architectures
- Class A: the DDL is used to configure the FERO; monitoring is based on a different technology
- Class B: there are two options to configure the FERO: DDL-based (same as Class A) or non-DDL (Ethernet, etc.)
- Classes C and D: the DDL is not involved in configuration; configuration and monitoring share the access path to the FERO
Controls Technologies
The DCS interacts with devices via well-defined interfaces.
Hardware details are usually transparent to the upper layers (examples: CAEN, ISEG).
The preferred communication technologies are OPC and DIM.

Layer stack (bottom to top):
- Device hardware
- Process management (PLC, ...)
- Communications (OPC, DIM)
- Supervision (SCADA)
- Customization, FSM
Concept of the Front-end Device (FED)
[Diagram: the FED CPU runs a DIM server and drives the FERO hardware over Profibus, JTAG, etc.; DIM clients in the DCS subscribe to it, with an additional monitoring path through a PLC to the LVPS; on the DAQ side, the workstation (LDC) reaches the FERO through the DDL software and the DDL; the PCA coordinates DAQ/RC and DCS]
SPD – FED Interface to DCS
[Diagram: the SPD FED is a dedicated CPU (workstation) running the time-critical tasks: half-stave control via JTAG through the Router (with a JTAG return path), and SPD data sent over the DDL to the DAQ. Towards the DCS (PVSS) it exposes a standard DIM interface carrying VR control, VR status, currents, voltages and temperatures; everything below that interface is private software.]
DIM Protocol
- Service-based protocol: a client can subscribe to a service and define the update policy
- Easy to implement on different platforms
- DIM is a custom protocol

[Diagram: the server registers its services with the DIM name server; the client requests the service info from the name server, then subscribes to the service directly at the server and receives the service data; commands flow from client to server]
(Source: C. Gaspar)
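The subscription pattern on this slide can be sketched as a toy in-process model. This is only an illustration of the service/name-server idea; it is not the real DIM API, and none of the class or service names below come from DIM.

```python
# Toy model of the DIM pattern: a server registers services with a name
# server; a client looks a service up, subscribes directly to the
# server, and then receives every update. Illustration only.

class NameServer:
    def __init__(self):
        self.registry = {}              # service name -> serving node

    def register(self, name, server):
        self.registry[name] = server

    def lookup(self, name):
        return self.registry[name]

class Server:
    def __init__(self, name_server):
        self.services = {}              # service name -> current value
        self.subscribers = {}           # service name -> callbacks
        self.name_server = name_server

    def add_service(self, name, value):
        self.services[name] = value
        self.subscribers[name] = []
        self.name_server.register(name, self)   # "register services"

    def subscribe(self, name, callback):
        self.subscribers[name].append(callback)
        callback(self.services[name])           # initial update

    def update(self, name, value):              # publish new data
        self.services[name] = value
        for cb in self.subscribers[name]:
            cb(value)

class Client:
    def __init__(self, name_server):
        self.name_server = name_server
        self.last = {}                  # last received value per service

    def info_service(self, name):       # "request service" + subscribe
        server = self.name_server.lookup(name)
        server.subscribe(name, lambda v: self.last.__setitem__(name, v))

# Usage: an SPD-like temperature service (name invented)
ns = NameServer()
fed = Server(ns)
fed.add_service("SPD/HS0/TEMP", 24.5)
dcs = Client(ns)
dcs.info_service("SPD/HS0/TEMP")
fed.update("SPD/HS0/TEMP", 25.1)
print(dcs.last["SPD/HS0/TEMP"])     # 25.1
```

The update policy mentioned on the slide (on-change, periodic, ...) would live in the subscribe call; here every update is pushed immediately.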
Controls Hierarchy is Based on Functionality
[Diagram: the PCA coordinates DAQ/RC, DCS and Trigger; below the DCS, Control Units (Configuration CU, Monitoring CU, Trigger-status CU) steer the corresponding Device Units (Configuration DU, Monitoring DU, Trigger-status DU) of the FED, which act on the FERO hardware; commands flow down the hierarchy and status flows up]

See C. Gaspar: Hierarchical Controls Configuration & Operation, published as a CERN JCOP framework document: http://clara.home.cern.ch/clara/fw/FSMConfig.pdf

The definition and implementation of the Device Units is the detector's responsibility.

CU – Control Unit, DU – Device Unit
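A minimal sketch of the CU/DU split described above: commands propagate down the tree, and a Control Unit summarizes its children's states upward. The unit names and states are invented for illustration; this is not the JCOP framework API.

```python
# Sketch of the Control Unit / Device Unit hierarchy: commands flow
# down, status flows up. All names and states here are illustrative.

class DeviceUnit:
    def __init__(self, name):
        self.name, self.state = name, "OK"

    def command(self, cmd):                 # a DU acts on its hardware
        if cmd == "configure":
            self.state = "CONFIGURED"

class ControlUnit:
    def __init__(self, name, children):
        self.name, self.children = name, children

    def command(self, cmd):                 # commands flow down
        for child in self.children:
            child.command(cmd)

    def state(self):                        # status flows up
        states = {c.state if isinstance(c, DeviceUnit) else c.state()
                  for c in self.children}
        return "CONFIGURED" if states == {"CONFIGURED"} else "MIXED"

dus = [DeviceUnit("config_du"), DeviceUnit("monitor_du")]
fed_cu = ControlUnit("fed_cu", dus)
fed_cu.command("configure")
print(fed_cu.state())                       # CONFIGURED
```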
Time Flow of FERO Configuration
[Diagram: numbered time flow (1, 2, 3) of a FERO configuration between the PCA, DAQ/RC, the DCS, the FERO CPU and the FERO hardware]

The definition and implementation of the FSM is the detector's responsibility.
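The detector-defined FSM could look like the following sketch. The state and event names are illustrative assumptions, not the official ALICE FSM vocabulary.

```python
# A minimal finite-state-machine sketch for a detector configuration
# sequence. States and events are invented for illustration.

TRANSITIONS = {
    ("IDLE",        "configure"): "CONFIGURING",
    ("CONFIGURING", "done"):      "CONFIGURED",
    ("CONFIGURING", "error"):     "ERROR",
    ("CONFIGURED",  "reset"):     "IDLE",
    ("ERROR",       "reset"):     "IDLE",
}

class DetectorFSM:
    def __init__(self):
        self.state = "IDLE"

    def handle(self, event):
        key = (self.state, event)
        if key not in TRANSITIONS:
            raise ValueError(f"event {event!r} not allowed in {self.state}")
        self.state = TRANSITIONS[key]
        return self.state

fsm = DetectorFSM()
print(fsm.handle("configure"))   # CONFIGURING
print(fsm.handle("done"))        # CONFIGURED
print(fsm.handle("reset"))       # IDLE
```

Keeping the transition table as data (rather than hard-coded branches) is what makes it practical for each detector to supply its own FSM.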
SPD Readout Layout
[Diagram: the Routers sit in VME crates connected via MXI-2; the DCS attaches through a PCI-MXI bridge, the DAQ through MXI-VME]

One Router services 6 half-staves; the SPD contains 20 Routers.
Controlling the VME Crates – MXI Daisy-Chain
- only one PCI controller needed
- programming is easy – the chain is transparent to the software
- performance-related questions remain
Controlling the VME Crates – 2 PCI-MXI Bridges in one PC
- two PCI controllers needed
- programming still easy (lookup table?)
- performance – we could gain by using parallel processes
Controlling the VME Crates – 2 PCI-MXI Bridges in Two PCs
- two PCI controllers and two computers needed
- programming more complicated on the upper level
- performance – probably the best
Tasks Running on the Control Workstation
Tasks:
- PVSS
- DIM servers
- Local monitoring

Some of these are "FAST" (time-critical) tasks, others "slow".

Can a single machine handle this load? Do we need to separate PVSS from local control? Do we need to separate the two sides of the SPD? Do we even need 3 computers?

The answer will be obtained from prototypes.
SPD needs additional processing of the configuration data
- we need to develop a procedure for fast detection of the bus status
- the configuration data must be correctly formatted
Internal Chip Problems Can Affect the Configuration Strategy
[Diagram: a bus with several failing chips marked]
Internal Chip Problems Can Affect the Configuration Strategy
- We need to develop a mechanism for problem recovery
- This should not be implemented as a patch in the configuration routine!
- Problems should be described in a "recipe" which is loaded from the configuration database together with the configuration data
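The recipe idea can be sketched as follows. The chip names, recipe actions and the loading helper are all invented for illustration; the SPD's actual database schema and JTAG loading code differ.

```python
# Sketch of recipe-driven configuration: the per-chip workarounds live
# in a "recipe" loaded alongside the configuration data, instead of
# being patched into the loading routine. Everything here is invented.

def load_chip(chip, data, loaded):
    loaded[chip] = data                  # stand-in for the real JTAG load

def configure_bus(config, recipe):
    """Apply the configuration, consulting the recipe for known problems."""
    loaded = {}
    for chip, data in config.items():
        action = recipe.get(chip, "load")
        if action == "skip":             # chip known bad: do not touch it
            continue
        if action == "retry":            # chip flaky: load it twice
            load_chip(chip, data, loaded)
        load_chip(chip, data, loaded)
    return loaded

config = {"chip0": b"\x01", "chip1": b"\x02", "chip2": b"\x03"}
recipe = {"chip1": "skip"}               # would come from the config DB
result = configure_bus(config, recipe)
print(sorted(result))                    # ['chip0', 'chip2']
```

The configuration routine itself stays generic; changing how a broken chip is handled means editing the recipe in the database, not the code.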
Detector Calibration – Standard Approach
[Diagram: the PCA coordinates DCS and DAQ/RC]

Calibration cycle:
1. Load thresholds and test patterns (online, DCS)
2. Run the DAQ and log data (online)
3. Analyze the data (offline)
4. Prepare new configuration data (offline)
Detector Calibration – Standard Approach
Synchronization between DAQ and DCS via the PCA will add some overhead: a conservative estimate of ~7680 synchronization cycles will add about 2 (or even more) hours of dead time.

We need a local calibration procedure: the SPD will be put into an "ignored" state during the calibration. We need to define the FSM and the DCS recipe.
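The dead-time figure is consistent with roughly one second per synchronization cycle. The per-cycle cost below is an assumption; the slide only gives the cycle count and the resulting hours.

```python
# Back-of-the-envelope check of the dead-time estimate above.
# The ~1 s cost per PCA synchronization cycle is an assumption.

cycles = 7680
seconds_per_cycle = 1.0            # assumed overhead per DAQ/DCS sync
dead_time_h = cycles * seconds_per_cycle / 3600.0
print(round(dead_time_h, 2))       # 2.13 -> "about 2 hours"
```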
…But…
This was still not the bad news
Software/hardware overhead
Loading a single chip needs ~300 ms, and more than 99% of this time is communication overhead. This time seems negligible... but...

The ALICE1 chip is really complicated and big. Remember: when we started, we needed some 2 hours to scan a single chip; this has been reduced to some 5 minutes using several tricks.

The time needed to scan a bus is still ~45 minutes (or 15 with less statistics) and cannot be reduced: the amount of data is bigger by an order of magnitude.
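A quick check of what these numbers imply, combining the Router and half-stave counts from the readout-layout slide with an assumed 10 readout chips per half-stave (the per-half-stave chip count is not stated on these slides):

```python
# What ~300 ms per chip with 99% overhead means in practice.

ms_per_chip = 300
overhead_fraction = 0.99
payload_ms = ms_per_chip * (1 - overhead_fraction)
print(round(payload_ms, 1))        # 3.0 -> only ~3 ms of real transfer

# Full-detector load time, assuming 10 chips per half-stave:
chips = 20 * 6 * 10                # Routers x half-staves x chips
total_s = chips * ms_per_chip / 1000
print(total_s / 60)                # 6.0 minutes for a single full load
```

A single configuration pass is cheap; it is the repeated scans (thousands of loads per calibration) that turn the per-chip overhead into hours.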
Detector Calibration
We cannot simply implement the present procedures: the estimated time for a scan is ~30 hours, with 8 hours of JTAG activity.

Ways to reduce the needed time:
- Run scans in parallel... but only one Router can be addressed at a time
- Use the built-in macro option of the KE-JTAG controller
- Implement a part of the scanning procedures in the Router's hardware
SEU Monitoring
Standard approach:
- Write the configuration data into the ALICE1 chips
- Compare the read-back output with the previously written configuration

...But...
- The analyzing routines must understand how the configuration is written (bus configuration)
- Part of the data will be lost:
  - due to the nature of the ALICE1 chips (stuck LSB)
  - due to the tricks used to load chips with internal problems
DCS Architecture: Data Flow (Configuration & Logging)

[Diagram: the configuration DB holds the PVSS configuration, the DCS recipes, the FERO configuration and the device configuration, which are loaded into the subsystems' hardware; monitored values are archived, and a subset is written to the conditions DB]
Required tasks
- Definition of the configuration data
- Definition of the monitoring limits (recipes)
- Definition of the data subset written to the conditions DB
- Development of offline analysis tools
A few recommendations
- Base the development on reverse engineering of the PTS
- Use Windows XP and, if possible, Visual Studio .NET as the development platform (at least for final product testing)
- Use MySQL for database prototyping
- Restrict database programming to standard SQL: we will probably change the underlying database for the final system (Oracle?)
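The point of restricting to standard SQL is that the same statements survive an engine swap (MySQL for prototyping, Oracle later). A sketch, using SQLite only because it ships with Python; the table name and columns are invented, not the actual configuration schema.

```python
# Portable, plain-SQL prototyping: no engine-specific types, hints or
# functions, so the statements run unchanged on another backend.
import sqlite3

PORTABLE_SQL = [
    "CREATE TABLE fero_config "
    "(chip_id INTEGER, dac_name VARCHAR(32), value INTEGER)",
    "INSERT INTO fero_config VALUES (0, 'pre_vth', 200)",
    "INSERT INTO fero_config VALUES (1, 'pre_vth', 195)",
]

conn = sqlite3.connect(":memory:")
for stmt in PORTABLE_SQL:
    conn.execute(stmt)
rows = conn.execute(
    "SELECT chip_id, value FROM fero_config WHERE dac_name = 'pre_vth' "
    "ORDER BY chip_id").fetchall()
print(rows)                        # [(0, 200), (1, 195)]
```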
Conclusions
- PTS 2.0 is available
- FERO configuration & monitoring needs a lot of work