Page 1 WIPAC MONTHLY The Monthly Update from Water Industry Process Automation & Control www.wipac.org.uk Issue 6/2017 - June 2017


Page 2: WIPAC Monthly - June 2017


In this Issue

From the Editor.................................................................................................................... 3

Industry News..................................................................................................................... 4 - 13

Highlights of the month’s news from the global water industry, centred on the successes of a few of the companies in the global market.

Empirical Modelling in the Water Industry - A primer.................................................... 14-18

In this month’s feature article, David Brydon of Blackwell Consulting explains the different types of empirical models that are used in the Water Industry and their applications, including case studies of where their use has provided significant benefit.

Lime Storage Silos - Disasters waiting to happen............................................................... 19-20

In this article by Hycontrol, we learn about the potential dangers of lime storage silos and how the installation of safety systems allowing ground-level confirmation of the operating state of the silo will provide assurance that the silo is safe.

Activated Sludge Plant Optimisation & Aeration Efficiency................................................ 21-23

This application note from Xylem shows the use of a trial site to examine the effects of an industrial effluent on the alpha factors when changing from Jet Aeration to Fine Bubble Diffusers. The article also presents an innovative design of optical dissolved oxygen probes.

Workshops, Conferences & Seminars................................................................................... 24-25

The highlights of the conferences and workshops in the coming months

WIPAC Monthly is a publication of the Water Industry Process Automation & Control Group. It is produced by the group manager and WIPAC Monthly Editor, Oliver Grievson. This is a free publication for the benefit of the Water Industry, so please feel free to distribute it to anyone you feel may benefit.

All enquiries about WIPAC Monthly, including from those who want to publish news or articles within these pages, should be directed to the publications editor, Oliver Grievson, at [email protected]

Page 3: WIPAC Monthly - June 2017


From the Editor

It seems like data has become the subject of the moment, as reports this month by the engineering consultancies Arcadis and Black & Veatch have both highlighted data as a key factor in their annual reports on the water industry. We also have CH2M looking at it on behalf of WE&RF, and in the UK the economic regulator, OFWAT, has also highlighted data as a key subject of interest. It has been a subject in the water industry for a number of years and it forms the basis of the Smart Water Industry. As a whole the thinking has been around DRIP (Data Rich, Information Poor), but there is of course the phenomenon where the data isn’t rich and you have to look at the quality of your information sources. It is very easy to be fooled by what your instruments are saying, and this is why we as an industry have to heed the example set by our colleagues in analytical laboratories. For those of you who have spent time in the “labs”, you know that AQC and calibration are ingrained deeply very early on.

What was interesting in the OFWAT report was that it set out a number of expectations and a number of key areas for which data should be used by the Water Industry. These included:

• Data Innovation
• Customer Empowerment
• Collaborative Working
• Data Strategy
• Data Knowledge
• Data Quality
• Data Security

Now as an industry we are starting to see pockets of this across the world. OFWAT particularly mentioned Advizzo, a firm that looks at customer engagement and gamification; if you ever get the chance to see Julian Lancha, who heads up the company, then you’ll understand what it’s all about. On top of this we’ve seen companies like EPAL (Portugal), DC Water (USA) and UnityWater (Australia) actively engage with their customers to manage water use and to inform the customer of what is going on. We hear of UnityWater’s success in keeping bills lower for the customer, which is after all what is wanted from a customer and financial regulator point of view. So for the use of data around the world we can tick innovation, we can tick empowerment, and we can tick collaborative working. Data security is probably one that we can never tick, as there are always new threats and new methods of attack; having heard presentations by various security experts over the past few years, it is truly scary what can be done. As a world, though, we do seem to be getting more and more security conscious, and the teams of specialists within companies who keep our IT safe face a constant battle that the vast majority of us never really see happening in the background. Like security, data quality is also a constant battle, but it is a battle that we know.

So data is certainly a focus... we are being told this from a number of different angles and we are starting to see huge developments being made, but in reality it has always been there. The article in this issue from David Brydon about empirical models of course uses data, and he has been working with it for many years. The work in the article by Welsh Water on modelling bacterial quality in reservoirs is also heavily data dependent, as are all of the innovations that universities like Imperial College and the University of Exeter have been looking at in hydroinformatics. These are all areas that are data dependent and use the data that is gathered to turn it into useful information.

All of the reports that have come out this month are right: the data is a rich “gold mine” that is only starting to be tapped. It is not going to solve a problem that is, say, caused by 20% growth in an area that requires a capital investment, but it might tell you when that investment needs to be made and what is needed to satisfy the need. It seems one of the future areas of the Water Industry will most assuredly be “all about the data...”

Have a good month

Oliver

Page 4: WIPAC Monthly - June 2017

Industry News

British Water seeks case studies for Data and Analytics library

British Water’s Data & Analytics Focus Group is calling on professionals from the water and utilities sector to contribute case studies and unsolved challenges. A searchable, universally accessible digital library is being created by the group to share examples of best practice and encourage collaboration.

British Water technical director Marta Perez said, “We would like as many case studies as possible from within the water industry and from other relevant sectors. Data is where the big efficiency gains in water can be made, but there is still a lot for utilities, large water users, academics and practitioners across the supply chain to learn.”

The Data & Analytics Focus Group brings together experts on data capturing, processing and usage from across the industry and is open to all British Water members. The application stories will feed into a code of practice currently under development by the Group.

Joe Roebuck is convenor of the Data & Analytics Focus Group and business analytics director at SEAMS. He says the data and analytics library, which will be hosted on British Water’s website, should include examples from all levels and areas of business and operations.

Broad remit

“The remit is very broad, from the data capture that feeds into strategy planning for the five-year price reviews UK utilities have to produce, down to where in the network monitoring devices are placed.

“Examples could come from asset management, customer service, process efficiency, leakage, water quality, cyber security and health & safety. We are also interested to find best practice in the visualisation and communication of data – how to match data display and dashboards to different users.”

Roebuck says the group also would like to hear about the outstanding and emerging issues facing utilities and large water users on the data and analytics front, as well as completed projects.

“What are the challenges facing end-users?” he asks, and “what do they need from the supply chain?

“Better understanding and huge efficiencies can be achieved when the correct data is gathered and analysed appropriately. A rich bank of application stories and industry challenges will help the group develop a robust code of practice.”

Submissions should be made on the specially designed template, which can be downloaded here or requested directly by emailing [email protected]. The deadline for the first wave of submissions is 31st July 2017.

The next meeting of the British Water Data & Analytics Focus Group is 7 September 2017.

OTT issues Flood Warning

Nigel Grimsley, Managing Director of OTT Hydrometry, is urging those responsible for flood prevention and management to take action this summer to ensure they are protected before the arrival of severe weather. “Saturated soil and blocked channels contribute to the severity of flooding, but in many cases it is the location and intensity of rainfall that contributes to the speed with which flooding occurs. Early warning of intense rainfall events or blocked channels is therefore essential. Recent advances in technology mean that it is now relatively simple and low cost to set up advanced warning systems so that appropriate measures can be taken quickly and effectively,” he says.

“The early implementation of prevention and mitigation measures provides essential protection for valuable assets. For example, on a small scale, we have installed water level monitors in village streams that are prone to flooding during intense rain or when channels become blocked. On a larger scale, we have supplied 153 monitoring stations for the Harris County Flood Warning System in Texas, where flooding represents a major threat to life and property, so early warning of intense rainfall and rising water levels provide opportunities for mitigation and where necessary, evacuation.”

OTT’s latest flood warning systems provide easy access to monitoring data from PCs and smartphones. However, crucially, they are also able to automatically issue text or email alarms when rainfall or water levels approach dangerous levels, so that timely action can be taken.


Page 5: WIPAC Monthly - June 2017

Research To Clarify The Benefits Of Big Data Analytics To Water Utilities

The Water Environment & Reuse Foundation recently awarded a contract to CH2M for research on Designing Sensor Networks and Locations on an Urban Sewershed Scale. The Internet of Things (IoT) is driving a revolution of sensing technologies and communication platforms, spurring the development of advanced sensors that will provide access to new data streams for the wastewater industry. The research will provide a basic roadmap for the application of remote sensor technology to address the challenges facing wastewater utilities.

The overall goal of the research is to determine the current capabilities and state of designing and implementing sensor networks in urban sewersheds. The research will define the most pressing issues wastewater utilities face including underground storage to minimize wet weather overflows, flow to pump optimization, overflow and infiltration/inflow issues, water quality monitoring, and installation of early warning systems. The research team, which includes Clean Water Services, will design an online survey and conduct interviews to yield quantifiable indicators of the current and future state from organizations focused on smart water systems.

Ultimately, this research will provide a clear understanding of big data analytics and its benefits to water and wastewater utilities. The results of the project will provide utilities with new information that will help them make better informed decisions, reduce operating costs, and identify cost-effective, focused capital replacement. The researchers will include a number of case studies that illustrate capital deferment, targeted mediation, requirements management, and frameworks for implementation. Utilizing IoT sensing technology will help operators understand how to use real-time data to efficiently manage assets, proactively flag risk, and avoid failure when possible.

CH2M is a renowned technology and applied research consulting firm spanning all aspects of water resources and smart cities, including urban planning and utilities, with a special interest in the creation and provision of valued information to users. Clean Water Services, a public utility serving urban Washington County, Oregon, treats 60 million gallons of wastewater per day.

This project is expected to be completed in 2018.


MUS uses Robotic Process Automation for safe dig planning

Morrison Utility Services (MUS) is using Robotic Process Automation (RPA) technology to deliver new safety, quality assurance and efficiency levels to its utility plan (‘safe dig pack’) preparation processes.

The company produces more than 150,000 safe dig packs a year, used to establish the location of underground assets - electricity cables, gas/water pipes, telecoms cables, etc - and ensure appropriate planning and safety precautions are taken before excavation work takes place.

The solution deploys software robots that mimic defined human actions to retrieve asset location information and handle high-volume repeatable tasks. Its introduction is delivering a number of benefits to the MUS safe dig pack process, including:

• Enhanced safe dig pack quality and consistency (up from 99.88% to 100%)
• Improved transactional processing efficiency
• Reduced response time for delivering requests - faster, more accurate throughput with potential to cut data entry costs by up to 70%

Andy Carter, MUS director of Business Process Improvement, said: “An estimated 60,000 underground cable strikes occur in the UK each year, each with the potential to cause a serious injury or fatality, as well as the significant financial impact of associated damages and compensation costs. With safety at the heart of the process, our safe dig packs have to contain zero defects and are often required at short notice to meet business demand. Without this information, our teams in the field are unable to proceed with their work – leading to aborted work costs and the potential for customer inconvenience and financial penalties.”

Carter added: “The RPA solution is both flexible and easily configurable, with none of the complex coding and formulas that other automation techniques often require, providing the scope for us to deliver core processes more efficiently, accurately and cost-effectively. To date, RPA has demonstrated the capacity to produce an average of 800 packs per day, with the potential for more if required – a significant increase on the previous daily average of just under 600 safe dig packs produced by human hand.”

The RPA solution is being piloted as part of a ten-year strategic business process outsourcing (BPO) partnership with Sopra Steria, a leader in digital transformation.


Page 6: WIPAC Monthly - June 2017

Latest Strategic Directions Report from Black & Veatch examines role of data analytics and true cost of delivery

Customer education, infrastructure modernization and the use of data analytics will be key tools for overcoming the water industry’s perennial challenges posed by aging infrastructure. A combination of investment and new business process approaches will also be critical to closing the gap between costs and consumer expectation, according to the Black & Veatch 2017 Strategic Directions: Water Industry Report. The report, released in early June, also addresses the increasing focus on sustainability, as well as the equally diverse strategies used by industry leaders to achieve it.

“Sustainability has different meanings to different segments of the industry. We pursue sustainable water supplies that can serve residents for decades to come. We look for ways to become more economically sustainable by balancing system and user needs with available capital. At the same time, we pursue a kind of social sustainability by engaging consumers as full partners in the pursuit of a supply that’s safe and resilient against weather events and long-term climate change.”

Cindy Wallis-Lage, President of Black & Veatch’s water business

Maintaining or expanding asset life was named the most significant sustainability issue for water utilities. Survey respondents, however, are showing significant interest in uniting data from once-siloed systems to increase operational efficiencies and inform smarter asset management.

“Data analytics provide new levels of system intelligence that can address many of the problems hampering sustainable water supplies. Smart meters and new software-based management tools enable us to turn all that data into understandable, useful insights that can address everything from water safety, asset performance and leak detection, to integrated planning and energy efficiency.”

Mike Orth, Executive Managing Director for the Americas in Black & Veatch’s water business

The report finds that financial challenges associated with sustainable systems have shifted, with fewer providers selecting finance-driven topics as their top issues on the path to sustainability. This may reflect growing confidence in funding through channels such as the Water Infrastructure Finance and Innovation Act (WIFIA), and greater confidence that government leaders and customers may be more prepared to accept rate increases as a means to pay for critical improvements.

“Sustainable and resilient systems will depend on industry leaders who can both collaborate and innovate as the water sector attempts to modernize its assets, optimize existing resources and convince customers that upgrades are important,” Wallis-Lage said. The report also examines other issues key to the industry, such as physical security and cybersecurity, trends in enterprise asset management and financing strategies.

“Proactive, two-way engagement can help convey water’s true value to the community. This deliberate focus on the customer experience can change cost and water quality perceptions and help secure the rate increases needed to upgrade aging infrastructure.”

Ralph Eberts, Executive Managing Director for Black & Veatch management consulting

Other key findings include:

• Nearly half of survey respondents (47 percent) indicated that key stakeholders and the public understand the need for proposed rate increases, but still want water providers to “do more with less.”

• Despite increased public scrutiny, when asked about lead and copper corrosion, 55 percent of water utilities indicated it has not been an issue in their distribution systems.

• The report survey found that integrated planning trends higher among larger communities; the larger the population, the greater the rate of adoption. Of the respondents who already use the approach, 49 percent are from communities of 2 million people or more, and the acceptance rate drops as the community shrinks in size.

• More than 30 percent of utilities have indicated plans to implement advanced operational technologies such as advanced metering infrastructure and enterprise asset management.

• Nearly 40 percent of respondents said data analytics figured into their processes but not operationally; another 20 percent said data analytics weren’t part of their current processes but figured into strategic planning.

This year’s Black & Veatch strategic directions report can be downloaded by clicking here


Page 7: WIPAC Monthly - June 2017

DC Water’s new open data portal shows why your toilet is flooding

The Washington utility launches a new portal to automate customer requests for information about sewage overflows, water main breaks and other issues affecting service.

When raw sewage is spewing from a manhole, open data is probably the last thing people might be thinking about. Yet, it is now the key communication tool for DC Water, Washington D.C.’s utility service, which has launched an open data portal (click here) for better communication with residents.

After a soft launch in early May, DC Water announced the portal’s official launch on Thursday. It debuted with five data sets that map sewage overflows, water main breaks, automated meter replacements, active fire hydrants and capital improvement projects — like the construction of water mains and storm water pumping stations.

DC Water CIO Thomas Kuczynski said the intent of the portal is not only to increase transparency with customers, but also to create a data hub that residents could visit for information on emergency incidents like flooding when a main breaks, or during sewage spills.

The portal also acts as a kind of automated research tool for the community, the utility’s outside regulators and companies assessing the city’s water grid. In the long term, this is expected to produce a sizable decline in phone calls for staff delivering updates and information.

“We were looking to solve a strategic problem of sharing data, but we were also looking to have a flexible and robust piece of IT infrastructure that was easy for the public to use,” Kuczynski said.

Whenever community debates arise around water rate hikes, the common question residents ask is what they’re receiving in return for their money. If new infrastructure is raising water prices, Kuczynski said, people can see information about every project.

The utility service came under fire after aging meters started producing inflated water bills, according to WRC-TV, the local NBC affiliate, and this set in motion a massive effort to replace meters with modern technology. To ease anxieties, the data portal pinpoints when each neighbourhood is scheduled for replacements.

“Those particular [meter replacements and capital improvement] projects are the ones that generate a lot of interest from the public, and a lot of questions from the District in terms of what’s going on in the neighbourhood, when do certain things occur, or what happened in the past and the likelihood of it happening again,” Hoffman said. “Now there is no need for a customer to call us when we are going to do their meter change.”

Data for the fire hydrants is updated daily. Updates for capital improvement projects happen monthly, while the rest of the data sets receive weekly updates.

“It’s closer to real time than we’ve ever been able to produce for this data, and downloadable, so if you want to analyze it offline you can do that,” Kuczynski said.

In the future, Kuczynski said, DC Water intends to refresh the portal with new data sets. There is a plan to include the utility’s lead service line map, a tool that displays which customer water pipes were made out of lead. When such pipes corrode, the lead can enter the drinking water and have harmful health effects. Other additions might include budget data, water quality complaints and details about service calls.

A map of sewer overflows in Washington D.C., between 2014 and 2017. (DC Water)

i2O Signs Historic Agreement With GDH Water In China

i2O announced recently that it has signed an agreement with GDH Water in China. Under the agreement, i2O will supply loggers to GDH Water and work on a wider collaboration to make i2O’s smart network solutions available to water companies in the Group and the wider Chinese market.

Joel Hagan, CEO at i2O, commented: “China is a huge market. It faces the same challenges as other parts of the world in water supply and distribution but is unique in the rate of growth of its population and the cities that accommodate them.”

“We are lucky to have GDH Water as a client and partner. It is a rapidly expanding and ambitious group that is committed to smart solutions to deliver a reliable and cost effective supply of safe drinking water. It has clear social responsibility and profit motives, being quoted on the Hong Kong stock exchange. We look forward to working in partnership with them.”

Qifeng Tan, Director and General Manager at GDH Water, commented: “i2O is a company that we have watched with interest over the last few years. During that time we have evaluated a number of local and international companies supplying loggers and advanced pressure management. We have found i2O to have the most accurate, robust and reliable offerings. We look forward to working in partnership with i2O to deploy its solutions in our companies, to bring them to the wider China market and to collaborate more closely in the future.”

Guangdong Investment Limited, the parent company of GDH Water, is quoted on the Hong Kong Stock market. Last year it turned over HKD10.46 billion with a profit of HKD4.65 billion. There are 28 water companies in the Group in four of China’s provinces, serving a population of 38 million people. It is China’s second largest water treatment company.


Page 8: WIPAC Monthly - June 2017

Ofwat calls on water companies to have robust data strategy

Water companies must be ambitious in the way they use data to benefit customers, and ensure they have a robust data strategy in place, Ofwat has insisted.

The regulator said data presents “massive opportunities” for the water sector.

“The water sector is sitting on an almost universal property data set because it supplies services into almost every household and business property in the country, services that are used by literally every person in the country and go to the heart of how people live,” said Ofwat chief executive Cathryn Ross.

“The potential in that data to improve lives, to drive innovation, enable efficiencies and improve resilience, is huge.”

Speaking at the launch of Ofwat’s report – unlocking the value in customer data – Ross said the regulator had identified four issues which will enable companies to make the most of their data.

As well as implementing a good data strategy, companies must “be smarter” in the way they use data. To do this, they must understand and have knowledge of the data they hold.

She said accuracy and high-quality data is “essential”.

“In April when we opened the business retail market, allowing business customers to choose their water provider, companies had to do a lot of work to get their data sets up to a certain quality. More work than we expected,” she added.

“This was very concerning to us and a sign that companies may have neglected their data sets for too long. Next year, new general data protection regulations will be implemented in the UK and there are huge implications for water companies.

“Not only will they give customers more control over their data, they will raise the bar on things like consent as well as allowing us as regulator to penalise those companies who are not in compliance.”

Ross also said data security was of paramount importance. “We see that these steps towards better data management is a journey for water companies,” she said. “Good quality, well understood and secure data builds a strong foundation from which to build a data strategy.”

A copy of OFWAT’s report is available on the OFWAT website by clicking on the following link

OFWAT’s expectations for the Water Sector (abstracted from the OFWAT report Unlocking the value of customer data)

Strategy, Knowledge, Quality & Security are key to the use of data


Page 9: WIPAC Monthly - June 2017

UK government’s own Cyber Essentials scheme suffers breach

An article on leading global online tech publication The Register suggests that the consortium supporting the UK government’s own Cyber Essentials scheme has suffered a breach – the incident has been reported to both the Information Commissioner’s Office and the National Crime Agency.

Launched by the UK Government in June 2014, the Cyber Essentials scheme is a cyber security standard that organisations can be assessed and certified against.

All suppliers bidding for government contracts which involve handling of sensitive and personal information and provision of certain technical products and services must be compliant with the new Cyber Essentials controls.

The article says that the email addresses of consultancies registered with the IASME Consortium, one of six accrediting bodies for the Cyber Essentials scheme, together with the addresses used to apply for an assessment and company names may have been released to a third party.

According to The Register, Dr Emma Philpott, chief executive at the Consortium, sent a notice to companies yesterday via email which stated:

“We would like to make you aware that, due to a configuration error in the Pervade Software platform we use for Cyber Essentials assessments, the email address you used to apply for an assessment and your company name may have been released to a third party.

“We would like to make it clear that the security of the assessment platform has not been compromised. Your account, the answers you provided in the assessment and the report you received are secure. No information other than your email address and your company name was accessible to the third party.”

The Pervade Software assessment platform is used by the Consortium and its certification bodies.

The notice continues:

“An unknown person accessed a list of email addresses in a log file generated by the Pervade assessment platform and your email address, company name and the IP address of the Certification Body was on that list. No other information was accessed. The other information on the assessment portal itself was not affected in any way and no-one has accessed the system, your account, the answers you provided or the report you received. This log file became accessible through a configuration error on the part of one of the Pervade systems engineers. Pervade have taken immediate steps to address the error and have resolved the issue.”

The companies have been warned to be cautious of phishing emails purporting to come from entities linked to the Cyber Essentials scheme, including the IASME Consortium, which could potentially contain malware.

Currently neither the Consortium’s nor Pervade’s websites appear to have published any public statements about the breach.

“Prime example of how all organizations need to focus on fundamentals - people, process & technology”

According to Paul Edon, Director of Professional Services (International) at Tripwire, the news that the government’s own Cyber Essentials scheme had suffered the breach is a prime example of how all organizations need to stay focused on the fundamentals; the people, the process and the technology.

Edon commented:

“The fact that a Cyber Security organisation can lose personal data through a misconfiguration should act as a stark reminder to us all just how relevant security fundamentals remain. File and System Integrity Monitoring is regarded to be one of the foundational security controls that form the basis of a technical security strategy. This breach is a prime example of how all organizations need to stay focused on the fundamentals; the people, the process and the technology.”

“Educating the workforce can hugely reduce the risk of successful cyber-attacks using vectors such as phishing and URL drive-by, it can also help users identify unusual system activity that may result from malicious action.”

“Incident Response is just one example of where a well-defined and regularly practised process can make a huge difference to the outcome of an incident, possibly preventing that incident from becoming a breach.”

“Technology forms a large part of the Foundational Controls necessary to support a defence-in-depth security solution. Encryption, dual factor authentication, vulnerability management, change management and mail/web filtering are just a few that come to mind. The most effective method to determine which foundational controls are required is to carry out a full and comprehensive risk assessment of the business, systems and data.”


Page 10: WIPAC Monthly - June 2017

Wessex Water acquires automated energy tariff switching service firm

Wessex Water has acquired Flipper – an innovative automated energy tariff switching service to help energy customers save money on their bills. The water company has plans to further develop Flipper to ensure customers don’t overpay for utility services.

Unlike traditional price comparison sites, Flipper automatically compares tariffs on a regular basis, ensuring that customers are always on the best possible deal.

The service, which launched in January 2016, charges a small annual membership fee and ensures Flipper’s interests are aligned with customers by not taking commission from suppliers.

Flipper continually scans the market and switches its customers automatically up to four times a year.

Wessex Water said the acquisition underpins its commitment to help ensure customers get value for money, and an easier way to manage their utility services.

David Elliott, Wessex Water Group director of strategy and new markets, commented:

“Both Wessex Water and Flipper have the same vision to ensure customers are provided with excellent and affordable services.

“Flipper works by putting the customer first and takes away the pain of assessing complex tariffs and pushing commission-led deals. Quite simply, it ensures you’re on the best tariff and does all of the hard work for you. It’s all about doing the right thing for the customer and offering both value for money and a hassle free way of managing utility bills.

“It was clear from when we first met that Wessex Water and Flipper share a common vision.”

“We have exciting plans for further developing Flipper that ensures customers don’t overpay for utility services.”

Existing investors in the business, Nigel Evans and Stephen Smith, have committed to remaining at Flipper and helping to drive the next phase of Flipper’s development.

Nigel Evans has been Chairman of Flipper since formation and has been the lead investor in the business. Stephen Smith joined Flipper as a Director and major investor within months of the company’s formation.

He is currently Director of Competition and Regulatory Strategy at Lloyds Banking Group where he leads LBG’s retail competition team in the CMA retail inquiry. He has over 20 years’ experience in the energy industry including seven years as an Executive member of the Board of energy regulator Ofgem.

Nigel Evans, chairman of Flipper, welcomed the deal and added:

“Today marks a major milestone for Flipper and a huge vote of confidence in our innovative and disruptive business model to transform the energy market.

“Under the deal, Wessex Water has committed to invest significantly in the business. This will enable us to rapidly grow customer numbers in energy.

“We will also look to respond to the feedback from Flipper customers and expand our automated switching service to other household bills.”

Click here to access the Flipper Website

‘Water Industry is looking for bigger plants, municipals for smaller modular plants’

‘Water Industry clients are looking for bigger scale water treatment plants, while municipals are looking for smaller modular plants’, said CEO Menno Holterman of Nijhuis Industries in his keynote presentation at Blue Tech Forum. Holterman discussed key industrial market trends, addressing issues such as new emerging business models, water re-use and the internet of things.

As CEO of the Dutch water technology supply company Nijhuis Industries, Menno Holterman has the opportunity to experience water challenges across the globe. Nijhuis Industries, founded in 1904 and headquartered in the Netherlands, provides consultancy, design & build, maintenance and operational services for industrial and municipal wastewater projects on every continent.

‘We see clients now actively asking for integrated solutions, including finance, operations and maintenance, which is a seismic shift’, Holterman said. ‘We also see a lot of industries, making water reuse, for example, mandatory.’

‘It’s only a matter of time before some clients start reuse across production,” Holterman continued, ‘especially in areas of water scarcity. There are still a lot of brands that do not want the product water to be originated from treated effluent, so they want to discharge the water into a river and pick up the water hundreds of meters downstream, take it in, treat it and use it as product water. But to take it directly from the effluent treatment plant back into the process or mix it with water originated from treated effluent is a no-go area, especially in the food, pharma and cosmetics industry, for mostly psychological and marketing reasons.’

Internet of things

In his keynote Holterman also mentioned the internet of things. ‘This sensor technology is going to help us tremendously and we are already going to the next stage in developing software which will help us to predict the behaviour of the plant if the characteristics of the water are changing’, he said.

Nijhuis has i-Monitoring devices in place with clients around the globe. ‘This data is also a very valuable resource to verify the design criteria of the extensive range of Nijhuis technologies and provide critical input to our R&D, design and process engineering teams’, Holterman assured.


Page 11: WIPAC Monthly - June 2017

Predictive analysis pays off for Welsh Water

Data scientists at Welsh Water built an analysis tool that can predict when service reservoirs are at risk of bacterial water quality compliance failures, and shaped it into an easy-to-use dashboard for its operational staff.

The Service Reservoir Bacterial Compliance Model uses data from all the factors which can contribute to water quality failures at service reservoirs – such as the level of chlorine in the water, the bacterial colony count from samples, temperature and asset condition – and calculates the risk of failure from this. This is then expressed as a single, colour coded figure that gives a spur to action for workers at the utility company to carry out remedial action and investigations to protect against such a failure occurring.

Bacterial compliance failures are caused by the detection of bacteria such as E.coli or coliforms in the regular sampling regime overseen by the DWI. Such incidents usually mean that a service reservoir must be drained in order to investigate the cause, leading to significant costs to the water company, reputational damage and potential disruption to customers’ water supply.

The predictive model can help Welsh Water reduce the risk of non-compliance by prompting the workforce to carry out proactive maintenance, raise the level of disinfection or undertake other interventions.

Dwr Cymru Welsh Water has a data strategy programme called WISER (Welsh Water Information Strategy Enterprise Roadmap) which supports the predictive model and other such data initiatives across the business. This strategy (see the ‘Wide Angle’ column below) provides an effective data governance framework that orchestrates people, process and technology to enable the leveraging of data as an enterprise asset.

A huge amount of data is available to water companies on their operations, and increasingly, the challenge for utilities is how to bring this data to the attention of time-pressed operational staff in the most timely, succinct and digestible fashion so they can act upon it.

If used well, today’s data tools have the power to not only give information about what is happening and to analyse it for patterns, but also to use these patterns to predict the future and to prescribe actions to improve matters – this use of data is integral to the modern vision of a ‘smart network’ for water.

For this reason, data has been described as the water industry’s biggest resource, with much attention being paid to the best way to harness this resource to the benefit of customers, employees and companies.

The Application

The man behind the Bacterial Compliance Model is Kevin Parry, Principal Statistician at Welsh Water, who began it in 2015 as part of a Master’s degree he was completing in operational research and applied statistics.

The first step in the project was a literature review and research exercise, to establish a list of all possible contributory factors to a bacterial compliance failure in service reservoirs (SRVs). The laboratory testing results themselves provided the most obvious data source, giving information on water temperature, bacterial colony counts and residual chlorine levels. However, other asset data - such as the source of the water used and the distance between the treatment works and the service reservoir – were also used, along with rainfall data.

The further water travels between treatment works and the SRV, the more likely it is that chlorine efficacy diminishes; rainfall is relevant because bacteria may enter the water through structural issues, such as cracks in the roof of the SRV (ingress).

However, one area in which data was insufficient for inclusion was asset condition: while Welsh Water employees were carrying out regular condition checks every three years, the documents that were being filled out were not in a form that could be easily harvested for data analysis in the model. In response to this finding, the company has started using a new e-form with simple drop-down menus for such visits, the information from which is automatically fed into a database.

Before building the model, Parry used two stages of statistical analysis. First, univariate testing, looking for direct relationships between individual factors and failures; and second, multivariate testing, where multiple factors are examined to see how they relate to each other. The findings from these processes were displayed visually and sense checked by operational colleagues before being taken forward for use in the predictive model.
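The article does not give the detail of these tests, but as a rough illustration of the two-stage screening described above, the sketch below (Python, with hypothetical factor names and made-up sample data) runs a simple univariate comparison of each candidate factor between failing and non-failing samples, followed by a multivariate logistic regression to see how the factors behave together.

```python
# Illustrative sketch only: factor names and data are hypothetical, and the
# published work may have used different statistical tests.
import pandas as pd
from scipy.stats import mannwhitneyu
from sklearn.linear_model import LogisticRegression

# Hypothetical sample-level data: one row per service reservoir sample
df = pd.DataFrame({
    "free_chlorine_mg_l": [0.45, 0.10, 0.32, 0.05, 0.50, 0.08],
    "colony_count_22c":   [2, 180, 15, 240, 1, 300],
    "temperature_c":      [8.5, 15.2, 10.1, 16.0, 7.9, 15.8],
    "failure":            [0, 1, 0, 1, 0, 1],   # 1 = bacterial compliance failure
})
factors = ["free_chlorine_mg_l", "colony_count_22c", "temperature_c"]

# Stage 1: univariate screening - compare each factor's distribution for
# failing vs non-failing samples
for col in factors:
    passed = df.loc[df["failure"] == 0, col]
    failed = df.loc[df["failure"] == 1, col]
    stat, p = mannwhitneyu(passed, failed, alternative="two-sided")
    print(f"{col}: U = {stat:.1f}, p = {p:.3f}")

# Stage 2: multivariate view - fit all factors together
model = LogisticRegression().fit(df[factors], df["failure"])
print(dict(zip(factors, model.coef_[0].round(3))))
```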

“This was very much a collaborative effort where we worked closely with our operational teams - they had plenty of buy-in and were very interested in the results as they were coming out,” says Parry. “They were in a position to say ‘that makes sense’ or ‘that’s a bit surprising’ or whatever it may be. They were very involved in that process and I used data visualisation techniques to make the more complex results as understandable as possible for them which benefited both parties.”

The strongest factors that emerged as predictors of failure were the residual amount of free chlorine in the water, and the total colony count of bacteria present. On the other hand, rainfall was found to be less significant.

For building the predictive element of the model, Parry had the choice of using any of 12 different machine learning and sampling techniques. The chosen method was eventually dictated by the imbalance of the dataset: Parry was able to draw on six years of data in total, running from 2009 to 2015, but the number of failures in this period was still relatively small. To remedy this imbalance, a method was used which created additional synthetic observations to add to the dataset, before a more standard classifier based on regression analysis was applied.
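The article does not name the oversampling method or the classifier; purely as a hedged sketch of that kind of workflow, the example below uses SMOTE from the imbalanced-learn package to create synthetic minority-class observations, followed by a logistic regression as the regression-based classifier, all on synthetic data.

```python
# Hedged illustration: SMOTE and logistic regression stand in for the unnamed
# oversampling technique and regression-based classifier described in the text.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for several years of SRV sampling data, with failures
# (class 1) forming only a few percent of the observations
n = 5000
X = rng.normal(size=(n, 4))      # e.g. chlorine, colony count, temperature, distance
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=2.0, size=n) > 4.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# Create additional synthetic minority-class observations (training data only)
X_res, y_res = SMOTE(random_state=0).fit_resample(X_train, y_train)

clf = LogisticRegression(max_iter=1000).fit(X_res, y_res)
risk = clf.predict_proba(X_test)[:, 1]    # a single risk metric per observation

print("balanced accuracy:",
      round(balanced_accuracy_score(y_test, clf.predict(X_test)), 3))
```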

The model produced gives a single metric for the risk of failure at each SRV, and was found to be able to predict failure with an accuracy of 70%. The next stage was to present this in an interactive tool which would be a spur to action for operational staff. This took the form of a ‘dashboard’ with the level of risk at each service reservoir represented as red, yellow or green; operational colleagues can see instantly where the points of greatest risk are by glancing at the colours on a map, and can also pull up a list of the five SRVs that present the highest risk in any geographical area.

“The advantage of using a single score is that our operational colleagues don’t have to look through lots of different reports, graphs and tables to make the decision about where the highest risk is,” concludes Parry. “It means that they can spend less time looking at data analysis and more time carrying out the remedial works that might be required.”

The User

“The bacti-predictor model has allowed my colleagues and I in North East Wales to take a strategic and scientific approach to managing service reservoir actions and to track performance across distribution areas. The tool itself is easy to use and gives us a great overview of data which otherwise would be spread over several different locations. I can compare sites and review several years’ worth of data.” Rosie Winter, Water Quality Scientist, Dwr Cymru Welsh Water

Wide Angle: Natalie Jakomis, Head of Data, Welsh Water

“Welsh Water is driving and funding a long-term permanent change in mindset and behaviour when it comes to data. We treat data as a corporate asset: it’s funded and managed just like all our other assets across the business.

“Historically, we’ve had data challenges from our corporate legacy systems, and we could have been better at getting access to data, extracting that data and generating insight from it in a timely and efficient manner.

“So what we’ve embarked on here is a data strategy journey, which we’ve called WISER - Welsh Water Information Strategy Enterprise Roadmap.

“It’s all about allowing Welsh Water to better exploit its data assets, so the insight that the data science team and the data teams across the business are able to generate can be turned into actionable behaviour more quickly and more effectively.

“We’ve undertaken workshops to look at which business areas require immediate data governance attention to improve our service: assets, customers and water quality came out as the three prioritised ‘data domains’. Within those data domains we are undertaking multiple workstreams: everything from business term definitions to classification standards and data quality key performance indicators.

“Each domain has data owners - who are accountable at a very high level for the trustworthiness and safeguarding of the data – and data stewards, who are the ‘doers’ who are directly involved with it. In the past, members of the data team have spent a lot of time improving the quality of the data, and mapping and aligning data across systems – but that’s not where their time is best spent. So as part of WISER, we’re also looking at improving that data quality automatically, by ensuring the data flows correctly across different systems.

“In a nutshell, WISER is about getting the right people involved, at the right time, in the right way, using the right data, to make the right decision, and ultimately leading to the right solution, to earn the trust of our customers every day.”

Networks make the world go round (not money!)

Chris Jones: People, families, communities, media, transport, communications, information, health, education, finance, commerce, industry and, of course, utilities – they are all about networks. Service providers need to understand how effective their networks are and what quality of service they are delivering. Historic performance information is useful in identifying long-term deterioration and risk of service failure and for designing remedies.

However, water companies increasingly aspire to use network information to understand what is happening now, to deliver insights into likely service levels over the next few hours and to drive real-time interventions to deliver great service in the most efficient way.

Given that we have been talking about real-time data and ‘smart’ water networks for at least the last decade, why does sensing for networks still feel novel compared to the extent of monitoring across our processes and assets? There have been various barriers preventing the uptake of sensing for water networks, but recent developments are helping to overcome some of these:

Availability of instruments to deliver real-time information at an affordable cost of ownership. We have driven instrument manufacturers to replicate lab-quality measurement in the field and to deliver high volumes of data across energy-demanding communications platforms. A number of factors are set to help resolve these tensions: battery technology continues to reduce the size and cost of energy storage while increasing the capacity and communications technology offers alternatives such as low power radio networks; more significantly we are realising that relative trends in service quality offer valuable insights and that maybe lab-quality measurement is not always required.

Capability of systems to store, manage and analyse vast quantities of data and to deliver actionable insights in real time. The advance in computing technology is clear for all to see: ‘cloud’ storage and processing able to securely handle vast quantities of data, internet of things enabling device to device communication, machine learning offering automatic recognition of patterns and anomalies, visualisation tools delivering information effectively to users, automated network control systems delivering cost and quality benefits.

Unclear business case for network scale monitoring. To some extent progress in delivering affordable sensors and data processing solutions will close the cost-benefit gap. However, genuine barriers remain: we are dealing with extensive legacy infrastructure that is difficult to access, meaning that retrofitting sensors is not trivial. Physical intervention options are constrained by resource availability and practicalities; if real-time action is not possible then the value of real-time information reduces.

At Sensing in Water this year we will discuss these issues and others in the session on sensors for networks and infrastructure; see www.swig.org.uk for further details.


Page 13: WIPAC Monthly - June 2017

Water Utilities Must Embrace Innovation Or Risk Losing Sustainability Dividends, Says Arcadis Report

Utility innovation is a pathway to sustainability

More than half of water utility leaders globally have yet to embrace innovation and risk missing out on important “sustainability dividends,” according to a new Arcadis report, Empowering Water Utility Innovation. Arcadis, the leading global design and consultancy firm for natural and built assets, unveiled the report in advance of the American Water Works Association’s Annual Conference & Exposition in Philadelphia, June 11-15.

Arcadis surveyed 423 utility professionals across 82 urban water utilities and discovered that only 40 percent engage innovation as a business practice. Yet more than 90 percent of respondents said it is critical to the future of their utility.

In the report, Jason Carter, Delivery and Innovation Lead for Arcadis North America, reveals how innovation generates measurable ROI while resulting in social, environmental and economic benefits. These benefits strengthen a utility’s brand, bottom line and satisfaction ratings to ultimately improve quality of life for its customers.

The report further points to innovations such as stormwater harvesting, advanced metering and real-time system monitoring as pathways to generating sustainability dividends. These dividends equate to greater revenue capture, improved demand management, waste reduction or increased asset longevity, to name a few.

Carter explains, “By building a culture of creativity, investment, experimentation and incubation, utilities can deploy innovation to foster new approaches to serving customers, managing facilities and funding infrastructure improvements. Innovation enables utilities to effectively engage internal and external resources to continuously improve operations and increase value for their customers through improved system resiliency, efficiency and quality, the three elements of water sustainability. Ultimately, it is innovation that leads to sustainability dividends.”

The report provides a “Utility Innovation Framework,” which comprises eight disciplines for creating a culture of innovation, such as focusing on defining challenges that guide investment, engaging stakeholders in transformative programs, reaching out to external resources and communicating success. The Arcadis report builds on a 190-page industry guidance manual, “Fostering Innovation Within Water Utilities,” which was recently published by the Water Research Foundation (WRF) and Water Environment & Reuse Foundation (WE&RF) with Carter as the principal investigator.

The Arcadis report also provides several case studies, including how three utilities embraced innovation to reap sustainability dividends.

Innovation: Ultrasonic algae control – American Water, a utility headquartered in Voorhees, New Jersey, piloted new technologies to drive improvements. The utility tested four ultrasonic algae control units in a reservoir in New Jersey, allowing a nearby reservoir to serve as a control. The six-month evaluation delivered sustainability dividends by showing effective control of the algae growth and no evidence of taste or odour problems or algal toxins. An economic assessment revealed the units saved approximately $87,800 in operational costs, with a projected payback time of 1.8 years for the system.

Innovation: Real time intelligence – According to the WRF/WE&RF guidance manual, Toronto Water used real-time intelligence from smart systems to control energy cost variations, planned/unplanned equipment downtime and demand/storage variations for greater system resilience. Sustainability dividends were achieved with cost savings from power projected at US $1.2M, and more coming from optimizing variable spot market energy rates.

Innovation: Intelligent water networks – Florida’s Jacksonville Energy Authority water division is using acoustic sensing technology to digitally assess pipe conditions, according to the WRF/WE&RF guidance manual. The utility uses an intelligent water network to deploy robust sensors into the network to generate actionable data while allowing for monitoring and actively managing performance in real time. This ability to accurately identify problems and immediately respond helps utilities target resources while providing sustainability dividends like eliminating outages and sewer flooding.

The report can be downloaded by clicking here


Page 14: WIPAC Monthly - June 2017

Feature Article:

Empirical modelling in the water industry – a primer

Abstract

Empirical models are one of the most useful tools available to engineers and scientists in the water industry. Their use, however, is often curtailed by incorrect perceptions or (even worse) incorrect development and use. In this paper, which is a primer rather than a detailed theoretical description, empirical modelling techniques and how they can be used in the water industry are introduced. The first part of the paper summarises well-known empirical techniques and gives guidance on common pitfalls. In the second part of the paper, several case studies from the author’s own work are presented.

Introduction

In the world of mathematical modelling there are theoreticians and there are practitioners. There are modelling pragmatists and modelling dogmatists. There will always be those who do not believe a model is valid unless it complies with all the relevant theory for that particular subject. There will, however, always be those that know some theory is more practically useful than others. I put myself firmly in the pragmatic-practitioner class. I’m a chemical engineer, not a mathematician or a statistician. I have, however, worked with modelling techniques, and particularly empirical approaches, with real plant data for 24 years, producing tangible solutions in practical situations. In this paper, which should be viewed more as a primer than a detailed mathematical treatise on the subject, I’d like to share some of my twenty-four years of experience using empirical techniques in the water industry, an area where they are often overlooked and misunderstood.

What are Empirical Models?

Any engineer, whether academic or practical, has seen process simulation software. In the water industry many will have seen GPS-X or Biowin. In the chemical and other process industries examples include Aspen Plus and ChemCAD. The types of models those packages include are mainly deterministic. That means they are based on the fundamental theory of chemistry, physics and biology that describes each process. In the chemical industry that might be a distillation column model that uses vapour-liquid equilibrium and thermodynamic theory. In the water industry it could be an activated sludge model based on biological growth concepts.

We like that sort of model. They turn engineering theory into something we can use to predict plant performance. Sometimes the theory isn’t well understood, or it’s not known at all. Assumptions are made to overcome such hurdles but assumptions can be an arbitrary fix until something better comes along and you could have a long wait for a better alternative. How practical is it to describe every possible reaction pathway in an activated sludge plant?

Empirical models are very different. They do not rely on theory, they rely on real data. Simplifications are not needed because the models are not based on incomplete theory. Empirical models, as their name suggests, are based on experience. This type of model forms relationships and equations between the variables used as its inputs and uses those relationships to predict something we’re interested in. Take a wastewater example: an empirical model of an activated sludge (AS) plant might use the composition and flow of settled sewage to predict the BOD5 leaving the AS plant.

Empirical models are things like regression equations and neural networks. Sometimes we call them black box models because we see a set of data go into the model and we see a prediction come out but what goes on in between doesn’t look like accepted process theory. That’s correct, it isn’t process theory but, with diligence, models that are just as good as those based on process theory can be developed.

Examples of Empirical Models

A. Simple linear regression

The first type of empirical model most people come across is the standard straight line regression, an equation we’re all taught as:

y = mx + c (1)

y is the thing we want to predict. x is the model input, m is a coefficient and c is a constant. There we go. Nice and simple. We’ve all used them. And I’m sure, me included, we’ve all abused them as well. Equation one is only ever useful if there’s a decent relationship between y and x. If there isn’t then models based on equation one are useless.

Equation one must be one of the most familiar and yet most misused pieces of empirical modelling theory. Take a look at graph one below. This is a real relationship derived from cost data used in the UK water industry. The line of “best fit” clearly isn’t very accurate (the r2 is only 0.15) and yet it was used for some time as a means of estimating GRP vessel costs based on their volume.

My advice: Simple linear regression is useful for giving a first impression of the strength of a relationship between two quantities but it’s only practically useful if you have two highly correlated variables, with one being easy to measure and the other less so (think COD and BOD5).

Graph 1 An inaccurate simple linear model
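
To make the point concrete, here is a minimal sketch in Python (not taken from the article) of fitting equation one to a hypothetical COD/BOD5 data set and checking the strength of the fit before trusting it; the numbers are purely illustrative.

```python
import numpy as np
from scipy import stats

# Hypothetical paired measurements (mg/l) - illustrative values only
cod  = np.array([250, 310, 420, 180, 530, 610, 290, 380])
bod5 = np.array([110, 140, 200,  80, 250, 290, 130, 175])

fit = stats.linregress(cod, bod5)   # y = m*x + c
print(f"m = {fit.slope:.3f}, c = {fit.intercept:.1f}, r2 = {fit.rvalue**2:.2f}")

# Only trust the line if r2 is high AND the relationship is plausible -
# an r2 of 0.15, as in graph one, means the model is not fit for purpose.
```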


Page 15: WIPAC Monthly - June 2017

B. Other simple regression procedures

Most spreadsheet packages offer some form of regression analysis. This is both useful and dangerous. If equation one doesn’t work then let’s keep trying until we find something that does! There! A logarithmic line of best fit “works”! But are you sure? How do you know it will always work? Is that relationship plausible? As with the simple linear regression models, other less complex procedures must also be used with great caution.

Other procedures that can be classed as simple include exponential, logarithmic, power and polynomial regressions. All of these are usually included in standard spreadsheet software. Care needs to be exercised with all of these: an exponential relationship might fit one data set but when more measurements are collected you might find a different approach gives a better fit. If an exponential or power regression gives a clearly accurate fit to the data, great. Always bear in mind the error of the regression, however, and always ask yourself if that line on the graph really does point to a meaningful relationship. It’s amazing how putting a line through a set of data gives the impression that two quantities are related (see graph one!).

Nowhere is this more important than with polynomial models. I’ll nail my colours to the mast here: I never use them and I discourage others from using them too. They are particularly prone to something called over-fitting. That occurs when an empirical model approximates a set of data so closely it cannot generalise to new data sets. In other words, the model is only really valid for the data set used to develop it. Using it on any other set of data is risky and so the model is effectively useless. That’s a big generalisation and I’m well aware of highly complex types of polynomial that are used successfully in many applications. The simple polynomials, however, the type that are included in spreadsheets, are things I usually avoid as my experience shows them to be unreliable (this is an oft-cited limitation of this type of model [1]).

My advice: If you have some simple pair-wise relationships to analyse then by all means test other types of simple regression. Always critically appraise the fit of the model and the plausibility of the relationship, however. Is that exponential curve really as good as it looks? And always, always, use polynomial models with great care or find an alternative.
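
As an illustration of the over-fitting problem described above, the short sketch below (illustrative data, not from the article) fits a straight line and a ninth-order polynomial to the same noisy data and compares their errors on points held back from the fit; the polynomial hugs the training points but generalises far worse.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 20)
y = 2.0 * x + rng.normal(0, 2, x.size)    # the underlying truth is linear plus noise

x_train, y_train = x[::2], y[::2]         # every other point used for fitting
x_test,  y_test  = x[1::2], y[1::2]       # the rest held back as unseen data

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    mse_train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    mse_test  = np.mean((np.polyval(coeffs, x_test)  - y_test)  ** 2)
    print(f"degree {degree}: training MSE {mse_train:.1f}, unseen-data MSE {mse_test:.1f}")
```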

C. Something a little more complex

The next stage of complexity from simple regression techniques is the multiple regression model. This extends equation one to look like equation two.

y = m1x1 + m2x2 + … + mnxn + c (2)

y is still the thing we want to predict. The parameter c is still a constant. Multiple regressions have more than one input, though, and these are the variables xn. Each model input has its own coefficient, mn. This starts to reflect the real world a little more, doesn’t it? The behaviour of something as complex as an activated sludge plant is due to many simultaneous interactions between a wide range of quantities. Flow, dissolved oxygen, nitrogenous and carbonaceous loads, alkalinity: they all play a part. But is a multiple linear regression the best way to model such a process?

Recall that this paper isn’t a detailed examination of empirical techniques and I’ll avoid complex mathematical or statistical discussion. Having said that, there is one distinction it will be good to make here. Multiple linear regressions (MLR) are, well, linear. The relationship between the parameters of the model is mathematically linear. This is something of an approximation of the real world. A working wastewater treatment works or water treatment plant generally does not behave in a well-ordered linear manner. Many processes, even relatively simple water or wastewater processes, are nonlinear. That might mean the relationship between a pair of variables is based on something like a squared term or an exponential. It might also mean that one parameter is affected by several others, where each relationship is complex and depends on multiple nonlinear terms.

Consequently an MLR model simplifies the relationships that can be used to predict a particular quantity. It straightens them out and tries to make them easier to interpret. Here lies one of the biggest pitfalls of empirical modelling: interpreting the coefficients (the mn values) of an MLR model.

There are many, many highly detailed text books that discuss the interpretation of the coefficients in MLR models. I won’t repeat that type of material here. It’s enough to say that the size of the coefficient and its sign can impart information about the relationship between the model inputs (the xn values) and the model output (the y value). It is enormously tempting to analyse the coefficients and draw conclusions from them that seem to give hitherto unknown insight into the mechanism you’re modelling. At this point I’d urge anyone who works with this sort of model to read the paper by Peter Kennedy of the Department of Economics at Simon Fraser University in Canada [2]. While it’s chock-full of statistical terms like heteroskedasticity, its general point is compelling: there are many reasons why the coefficients in an MLR model might not actually be correct. Consequently, interpreting the coefficients comes with significant risk.

Does this mean MLR models are to be avoided? Absolutely not. A well-constructed MLR model can be very powerful if you apply engineering judgement at every stage of its development. It’s always wise to subject empirical models to a sensitivity test when they have more than one input. Look at the range covered by the data and vary each input over that range, then observe the model’s response.

Let’s say you’re modelling headloss build-up (and hence filter run time) in a rapid gravity filter at a direct filtration water treatment works. The site adds ferric chloride as a coagulant upstream of the filter. The data contains ferric doses of between 5 mg/l and 25 mg/l. Keep all other inputs to the model constant at their mean value and vary ferric dose from 5 mg/l to 25 mg/l. Is the resultant trend plausible? Should run time actually increase as more ferric is added? Wouldn’t it decrease if more solids are flocculated and the filters blind quicker? Or is there another mechanism involved, linked to colour or turbidity (or both)?

My advice: Don’t rely solely on interpreting the coefficients of an MLR model to give you brilliant new insight into a process. They could easily be misleading but, if the model is well-constructed and the data well-collated, they can be instructive. Always put your model through a sensitivity test to see if its responses are plausible.
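
A minimal sketch of that sensitivity test is shown below, assuming a hypothetical data set with three inputs (ferric dose, colour, turbidity) and a single predicted quantity; the model is fitted, then the ferric dose is swept over the range covered by the data while the other inputs are held at their mean values.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical plant data - illustrative ranges and relationship only
rng = np.random.default_rng(0)
X = rng.uniform([5.0, 10.0, 1.0], [25.0, 60.0, 10.0], size=(200, 3))   # dose, colour, turbidity
y = 0.4 * X[:, 0] + 0.1 * X[:, 1] + 0.8 * X[:, 2] + rng.normal(0, 0.5, 200)

mlr = LinearRegression().fit(X, y)

dose = np.linspace(X[:, 0].min(), X[:, 0].max(), 5)   # sweep ferric dose over its data range
sweep = np.tile(X.mean(axis=0), (dose.size, 1))       # hold the other inputs at their means
sweep[:, 0] = dose
for d, pred in zip(dose, mlr.predict(sweep)):
    print(f"ferric dose {d:5.1f} mg/l -> predicted response {pred:6.2f}")
# The question to ask of each trend is the one posed above: is it plausible?
```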

D. Nonlinear models?

If linear models are a simplification of the real world then what does the “real” model look like? Put simply, if it’s not a linear model then it must be nonlinear (hybrids are available). It’s perfectly possible to develop a nonlinear regression (see equation three below).


Page 16: WIPAC Monthly - June 2017

y = c1x1 / (c2 + x2^2 + e^(x3^3)/2) (3)

Here the c parameters are coefficients and the x parameters are variables. Your equation can take any form you think fit but calculating the coefficients needs a bit of mathematical grunt work. Equations can always be linearised but you need to think of the form of the equation first, and who is to say it’s realistic? This is where engineering judgement is supremely useful. If you know that certain variables are influential, and you have some idea of the relationships, then you can construct a plausible model relatively easily.
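
As a sketch of how a user-specified nonlinear form can be fitted in practice, the example below uses least squares via SciPy's curve_fit on synthetic data generated from the form of equation three; the data, coefficients and ranges are all illustrative assumptions rather than anything from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(X, c1, c2):
    # The form of equation (3): y = c1*x1 / (c2 + x2^2 + e^(x3^3)/2)
    x1, x2, x3 = X
    return c1 * x1 / (c2 + x2**2 + np.exp(x3**3) / 2.0)

rng = np.random.default_rng(2)
x1, x2, x3 = rng.uniform(0.2, 1.2, (3, 100))              # illustrative input ranges
y = model((x1, x2, x3), 3.0, 1.5) + rng.normal(0, 0.05, 100)

(c1, c2), _ = curve_fit(model, (x1, x2, x3), y, p0=[1.0, 1.0])
print(f"fitted c1 = {c1:.2f}, c2 = {c2:.2f}")              # should recover roughly 3.0 and 1.5
```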

Sometimes it’s easier to forgo developing the model structure yourself and simply rely on a fixed structure. MLR models adopt a fixed equation format, and an analogue in the nonlinear world is the neural network model (NNM).

Many people have heard of NNMs. Gone are the days when it was only in The Terminator that NNMs got a public mention. Many large organisations use them for deep learning, machine learning and other artificial intelligence applications. Those terms are used frequently these days but their origins lie in techniques such as neural networks.

Figure 1 shows the classic neural network structure.

Don’t be put off by the figure. NNMs are just a collection of simple mathematics. There’s nothing more complex in a trained NNM than a bit of addition, multiplication, subtraction and, if you’re feeling adventurous, the exponential or a trigonometric function.

The mathematical properties of NNMs have been studied for many years by many people, especially in process engineering applications. I won’t repeat that work here but it’s enough to say that NNMs are very good at approximating functions. In engineering terms that means they are a good tool to use when you need to develop an empirical model that relates one set of quantities to another.

NNMs crop up in all sorts of applications: number plate recognition, fault detection and diagnosis, image recognition, biometrics, handwriting recognition and many more. There are many examples of industrial control schemes that have been built using NNMs, exploiting the modelling accuracy the method brings [3].

Things aren’t all rosy in the NNM world, however. Like MLR and other empirical approaches, they need a decent amount of data to give an accurate model that applies over a useful range. Like MLR, the coefficients within the model can be difficult to interpret. They can be prone to over-fitting and care is needed to make sure that doesn’t happen. All of these problems can be overcome relatively easily, though.

A sensitivity analysis should always be conducted on an NNM. This is in lieu of being able to interpret coefficients. In fact, a Monte Carlo approach to varying NNM inputs can lead to very useful operating rules. In one recent case study we used a Monte Carlo approach to show that certain combinations of operating conditions led to undesirable process performance. That helped the plant operator to develop simple rules that prevented an undesirable operating point.
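
The sketch below outlines a Monte Carlo sensitivity analysis of that kind in general terms (it is not the case-study code): random input combinations are drawn from the ranges covered by the data, passed through the trained model, and any combinations that breach a chosen limit are returned for inspection.

```python
import numpy as np

def monte_carlo_breaches(model, lower, upper, limit, n=10_000, seed=0):
    """Sample n input combinations within [lower, upper] and return those whose
    model prediction exceeds the limit (an undesirable operating point)."""
    rng = np.random.default_rng(seed)
    samples = rng.uniform(lower, upper, size=(n, len(lower)))
    predictions = model.predict(samples)
    return samples[predictions > limit]

# Hypothetical usage, assuming a fitted model and an input data matrix X:
# bad = monte_carlo_breaches(trained_nnm, X.min(axis=0), X.max(axis=0), limit=40.0)
# Inspecting 'bad' is what leads to simple operating rules of the kind described above.
```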

NNMs should always be developed using the train-and-propagate approach to avoid over-fitting. Some software packages call it the early-stopping approach. The coefficients in an NNM are determined iteratively by passing each set of input data through the network in turn. Train-and-propagate waits until all lines of data have been passed through the model and all coefficients have been updated before a separate test set of data is used to test the model. The new set of data gives us what I call the test error. As training progresses, you’ll often see trends like those in graph two.

The training error (the blue line above) will often continue to decrease as the NNM overfits the data. However, you can see that the test error (the red line) reaches a minimum and then starts to increase. The minimum point shows where the model gave the best fit to the test data. Everything after that shows deteriorating performance on the test data, which means the model is becoming progressively worse at generalising. Most NNM software packages will terminate training at the lowest test error.

My advice: Don’t be put off by NNMs. They are a supremely useful empirical modelling tool for process engineering applications when used correctly. In 24 years of working with them using real plant data, I’ve only ever found one example where they were worse than an MLR model. Do a sensitivity analysis and always, always use the train-and-propagate procedure and you’ll often end up with a very effective model.
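
For readers who want to try this, the sketch below shows one common implementation of the idea using scikit-learn's MLPRegressor, whose early_stopping option holds back a fraction of the data and halts training once the error on that test set stops improving; the data here are synthetic placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (500, 4))                                 # illustrative inputs
y = np.sin(3 * X[:, 0]) + X[:, 1] * X[:, 2] + rng.normal(0, 0.05, 500)

nnm = MLPRegressor(hidden_layer_sizes=(10,),    # one small hidden layer
                   early_stopping=True,         # hold back data and watch the test error
                   validation_fraction=0.2,     # 20% of the data used only for testing
                   n_iter_no_change=20,         # stop once the test error stops falling
                   max_iter=5000,
                   random_state=0).fit(X, y)

print(f"training stopped after {nnm.n_iter_} iterations; "
      f"best test-set R2 = {max(nnm.validation_scores_):.3f}")
# Always follow the fit with the sensitivity analysis described above.
```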

E. Symbolic regression.

I’ve introduced simple and multiple regression as well as nonlinear regression, so what is symbolic regression? Let me illustrate with an example. In 1998 I was working as a process engineer in the UK water industry for one of the large UK water companies. I was asked to use my empirical modelling and data analysis skills to develop a model that could forecast failures of coastal water quality. I was faced with a data set containing 56 variables and several hundred rows of data. Aside from the multivariate analysis I did on the data (the subject of another paper) I developed an approach to isolating important variables, things that really influenced coastal water quality. Now, I’m no mathematical genius and I’m certainly no Alan Turing or Andrew Wiles but it turns out the procedure I developed had some similarities to what we now know as symbolic regression. What I did was this.

Figure 1 A typical neural network structure

Graph 2 A typical training and validation error trend


Page 17: WIPAC Monthly - June 2017

I created a routine which randomly selected a number of variables from the 56 available (the number of variables differed on each iteration). Let’s say the routine picked five variables. It then randomly determined how many past values of each variable to use (things that happened on preceding days exert an influence on coastal water quality). It created an MLR, tested it on unseen data and repeated the process several hundred thousand times. I ended up with an enormous population of models, within which it was straightforward to pick out which variables appeared most frequently in the most accurate MLRs. It turned out that an independent study verified the results and, as part of a wider body of work, this randomised approach led to several important capital investment decisions.
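
A minimal sketch of a randomised subset-selection loop of that kind is given below; the data are placeholders and the loop is far shorter than the several hundred thousand iterations described, but the idea - fit an MLR on a random handful of the candidate variables, score it on unseen data, and count which variables keep appearing in the best models - is the same.

```python
import numpy as np
from collections import Counter
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 56))                        # 56 candidate variables (placeholder data)
y = 2.0 * X[:, 3] - 1.5 * X[:, 17] + rng.normal(0, 0.5, 400)
train, test = slice(0, 300), slice(300, 400)          # unseen rows kept back for scoring

results = []
for _ in range(5000):                                 # the real exercise ran far longer
    cols = rng.choice(56, size=rng.integers(2, 8), replace=False)
    mlr = LinearRegression().fit(X[train][:, cols], y[train])
    results.append((mlr.score(X[test][:, cols], y[test]), tuple(cols)))  # R^2 on unseen data

best = sorted(results, reverse=True)[:100]            # the most accurate models
counts = Counter(col for _, cols in best for col in cols)
print("variables appearing most often in the best models:", counts.most_common(5))
```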

Symbolic regression (SR) works in a similar manner but then takes the best bits of the best models and recombines them into a new population. For that reason it’s sometimes known as genetic programming because it relies on a “survival of the fittest” approach. SR starts by producing a population of models that is a random collection of variables and mathematical functions. Via complex rules it identifies which “bits” of each model are the most influential and “breeds” a new population that should be more accurate than the last.

SR has been around for a while now and it leads to some relatively simple one-line equations that are enormously powerful and which can offer real insight into an otherwise poorly understood process. One of the case studies later in the paper is about using SR in a wastewater treatment example.

My advice: Seriously consider SR. It can produce highly meaningful models and give great process insight.
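
By way of illustration only, the sketch below uses the open-source gplearn package (one of several tools that implement genetic-programming-based symbolic regression, and not necessarily what the author used) to evolve a one-line equation from placeholder data.

```python
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 2, (300, 3))
y = X[:, 0] * X[:, 1] + np.sqrt(X[:, 2])       # the hidden "true" relationship

sr = SymbolicRegressor(population_size=2000,   # models per generation
                       generations=20,         # rounds of "survival of the fittest"
                       function_set=('add', 'sub', 'mul', 'div', 'sqrt'),
                       random_state=0)
sr.fit(X, y)
print(sr._program)                             # the evolved one-line equation
```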

F. A quick summary

There are more types of empirical model than I’ve included here but these are the main ones. They all produce relatively simple equations or collections of equations that take one set of process variables and turn them into predictions of something else. They don’t rely on process theory explicitly but they do encapsulate that theory and site-specific idiosyncrasies implicitly within their coefficients. Techniques such as Monte Carlo analysis can be used to perform sensitivity analysis and derive meaningful rules from the models, dispelling the myth that such models are generally not as readily interpreted as deterministic models. Empirical models can also be used as part of online, real-time control schemes or as off-line simulation tools. They can be used to classify fault data (“You have a fault. Run for the hills”) or predict process performance (“the BOD5 over the next 30 minutes will move from 25 mg/l to 37 mg/l”). In short they have a multitude of uses. In the next section I introduce some examples of where they’ve been used in my 24 years of working with them. All the examples are from my career and they’re all based on real plant data.

Case Studies from Industry

A. Forecasting algal blooms

My first ever water industry empirical model. The problem here was forecasting when an algal bloom would occur in a lake used as a raw water source by a very large and important water treatment works (WTW). The alternative was a complex mass balance model based on biological theory. Using data about minor nutrients and the apparent and true colour of various rivers and streams that feed into the lake, graph 3 was produced. This is a five-day ahead prediction. In other words, the inputs to the model included values up to the present day but the output was the algal concentration at the works inlet five days on from that point. This was a great example of where empirical models could produce a cost-effective alternative to a more complex solution. (NB – the green line below is the measured data, the red is the model)

B. Forecasting raw sewage BOD5

We all know that BOD5 is both very useful and very frustrating. UK discharge consents are written in terms of BOD and it’s a jolly useful thing to use to analyse biological processes. However, it takes five days to measure in the lab so it’s useless as a feedback parameter. In this example the objective was to use variables that could be measured online if necessary to develop a model that could also be used online to predict BOD5. This approach, where readily available data is used to predict something that is otherwise tricky to measure with hardware, is sometimes called software sensing or soft sensing. Graph 4 shows the actual and estimated BOD5. The average absolute error was about 15 mg/l. Models that were more accurate could be produced when more variables were used. Again, the red line is the model, the green the actual data.

C. Coastal water quality

Now, this is one I’ve already introduced. I’ve had to remove the vertical scale from the graph for various reasons but it’s the match between the two lines that’s the important thing. This was an MLR model, the only one I’ve ever developed that was better than a nonlinear model. I think the reasons for that lie more in the randomised modelling method than anything else. The model produced no false positives (i.e. it never predicted a failure when all was fine). Similarly it produced no false negatives (i.e. it never forecast a quality pass when there was in fact a failure).

Graph 3 5 day ahead forecast of algal bloom

Graph 4 Actual and estimated raw sewage BOD5 trends at a UK sewage works

Graph 5 Actual and estimated faecal coliform trends at a UK coastal location


Page 18: WIPAC Monthly - June 2017

D. Forecasting total phosphorus

A symbolic regression example. This is very much like the BOD example, in that the aim was to develop something that could accurately forecast total phosphorus in a way that was cheaper than buying a bulky semi-online instrument. The process went through several thousand generations of models, each generation containing several thousand individual models. The total development time was measured in minutes and it arrived at the fit shown below.

E. Dual media filter run time

A nice NNM to finish with. A huge amount of multivariate analysis led to the identification of a range of raw water variables that influenced filter performance. The model also included variables showing the dose of various coagulants and polymers. In addition, it was a dynamic model, in that it also used past values of the inputs to add a time history of information to the model. The resulting model was highly accurate (this is actually the fit to a set of validation data that was not used at all in the development process).

Conclusions

These techniques are generic. They can be used in any area of any industry as long as there’s sufficient data with which to develop them and as long as that data is analysed correctly before the models are developed. Water treatment, wastewater treatment, sewer networks, coastal water quality; I’ve applied empirical models in all of those areas and more, delivering real, tangible benefits along the way.

This paper is nothing more than a primer, an introduction, a summary. The fine mathematical detail is not what is needed here. I hope I’ve shown that empirical models are a useful tool for the water industry. Deterministic models have their place but if it’s really high accuracy you need, or if no theory exists to forecast what you want, empirical models are the method to consider.

References

1. Clinical research computing: a practitioner’s handbook, Prakash Nadkarni, Academic Press, Elsevier, 2016, page 87

2. Oh no! I got the wrong sign! What should I do?, Peter Kennedy, Simon Fraser University, Discussion paper, 2002

3. Artificial neural networks – industrial and control engineering applications, Kenji Suzuki, InTech, 2011

About the Author

David is a chemical engineer who graduated in 1993 with a degree in chemical and process engineering. He worked at the University of Manchester Institute of Science and Technology (UMIST) between 1993 and 1997 as a Research Associate, carrying out practical research into the detection and diagnosis of faults in a five-storey pilot plant distillation column.

In 1997 David joined North West Water (later to become United Utilities) as a process engineer, staying with the company for nine years. In that time he had a range of roles covering all aspects of water and wastewater treatment, sewer networks, water networks and sludge treatment as well as investment analysis and reliability studies.

In 2006 David joined Anglian Water as a senior engineer dealing with industrial effluent and industrial water supply projects.

In 2008 David set up his own company, Blackwell Water Consultancy Ltd (BWC for short). BWC provides a wide range of trade effluent, water efficiency and data modelling services to all areas of the economy.

Graph 6 Actual and estimated total raw sewage phosphorus estimated by a GP model

Graph 7 Actual and estimated filter run time


Page 19: WIPAC Monthly - June 2017

Article:

Lime storage silos – disasters waiting to happen?

Introduction

According to Hycontrol’s MD Nigel Allen, many lime storage silos at water treatment plants are disasters waiting to happen, putting lives at risk and posing serious threats to the environment. Water companies are already under pressure to minimise the impact of treatment works on the local environment, especially in terms of odour and pollution. The potential for dust pollution from storage silos with ill-equipped protection systems adds another dimension to this. However, this threat is totally avoidable.

Powdered lime is used during the treatment of waste water to reduce odour in raw, primary sludge, as a cost-effective alternative to using digesters. It is also used in other water treatment processes to help balance pH levels, and as part of composting processes for sludge removed from the bottom of primary tanks after it has been de-watered and compressed. The lime and the remaining water in the sludge together create a heat-generating chemical reaction, accelerating the process.

Level measurement specialists Hycontrol have been designing specialist silo protection systems for over 20 years and have extensive experience of the potential problems that exist on sites, especially in the waste water industry sector. “Our findings are worrying to say the least and the photos taken by our installation engineers speak for themselves,” says Allen. “Companies just don’t seem to understand the consequences of poorly maintained protection systems. It’s quite frightening that operators accept pressure blow outs via the pressure relief valve (PRV), erroneously citing that ‘It’s OK - the PRV is doing its job’. This couldn’t be further from the truth - PRVs are there as a last resort. If the silo protection system is working correctly and is fitted with an automatic shut-off feature to prevent over-filling, the PRV should never be used. If a PRV blows then there’s an inherent problem with the system or the filling protocol and corrective action must be taken.”

“Material in and around a PRV is a tell-tale sign that there’s something wrong and a catastrophic blow-out is waiting to happen,” continues Allen. “The material blown out from the silos will almost certainly solidify over time and this will, at best, prevent the PRV from working correctly and, at worst, completely clog it up. Unfortunately many maintenance engineers just don’t realise the potential dangers that lurk beneath. They often think that simply cleaning off the material on and around the PRV is good enough. They don’t realise that if the PRV doesn’t lift next time an ‘event’ occurs, the over-pressure could easily rupture the silo or eject the filter housing from the top. On an ATEX rated silo the over-pressure could be sufficient to simulate an explosion and open the protective blast panels, resulting in costly loss of product and silo contents being left open to the elements.”

With regard to filter housings, Hycontrol engineers have witnessed another worrying practice at a number of sites where companies fit chains to prevent the housing being blown off the top of the silo, almost accepting the inevitable is going to happen.

What causes over-pressurisation problems?

Silo protection systems are designed to prevent the damaging and potentially dangerous consequences of silo over-filling or over-pressurisation when powdered material is being transferred pneumatically from road tankers to silos. Unfortunately, perched out on the top of silos, such protection systems are all too often ‘out of sight - out of mind’ - that is, until a major problem occurs.

Problems during the filling process usually arise through an inherent problem with the silo protection system or with the air filtration system on top of the silo. Problems can also occur through tanker driver/operator error. Delivery tankers are pressure-tested vessels typically capable of withstanding up to 2 bar (29 psi) pressure. Storage silos are designed to withstand the weight of material stored in them and can rupture at pressures as low as 1-2 psi above atmospheric pressure. The consequences of over-filling or over-pressurisation include:

• Serious or fatal injury to workers and the public
• Catastrophic silo damage
• Loss of material and production
• Harmful environmental pollution
• Damage to company reputation

A key issue with many silo protection systems is that without adequate ground level testing capabilities, operators don’t know if they will work when needed. Working at height restrictions limit silo top inspections and maintenance, especially in adverse weather conditions. However the main problem is: what can engineers actually do when they are at the top of the silo? And furthermore, how do you physically test a relief valve or pressure transmitter unless you remove them?

Even if the protection system does do its intended job and prevents a major incident, companies rarely investigate the root cause of the problem so that


Page 20: WIPAC Monthly - June 2017

remedial work can be carried out to prevent the situation re-occurring. Important ‘near miss’ events such as PRV lifts, high level events and high pressure events are routinely not recorded and often conveniently dismissed. Hycontrol have clear evidence that in practice there are more ‘near misses’ than realised and that the situation is a ticking time bomb.

Filter housings at the top of the silos are designed to vent the silo during filling, whilst preventing dust escaping into the atmosphere. Normally these are fitted with some form of self-cleaning system to keep filters clear. These are typically mechanical shakers or reverse jet systems. Although filter manufacturers give recommended check routines and filter replacement schedules, in practice it would appear these guidelines are regularly ignored. Faulty operation can be caused by a range of issues, including blockages and the fitting of unsuitable or wrongly-sized filters. Most powders form hard compounds when mixed with water from the atmosphere, further exacerbating the problems at the top of the silo.

Effective silo protection

The MPA (Mineral Products Association) publishes comprehensive guidelines for silo protection systems in quarries and cement works, but there are few or no such recommendations for powder silos used in a broader range of industries including waste water treatment, food and beverage, chemical, and plastics. However, the primary principles are the same for protecting any pneumatically filled silos.

Even with guidelines in place, the benchmark for the effectiveness of any silo safety protection system can only relate to the last time all the components were fully tested.

Optimum solution

The only effective solution is to take an integrated approach to silo protection design whereby the PRV, pressure sensor and high level alarm can be tested at ground level, prior to each fill. Only when all these safety devices have passed the checks should the safety interlock allow the silo inlet valve to open and the delivery to commence. The use of a ground-level test (GLT) system, as utilised in Hycontrol’s Silo Protection System, will also eliminate the risks of working at height.
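
A purely hypothetical sketch of that interlock logic (not Hycontrol's implementation) is shown below: the inlet valve is only enabled once the PRV, pressure sensor and high-level alarm have all passed their ground-level tests.

```python
def allow_fill(prv_ok: bool, pressure_sensor_ok: bool, high_level_alarm_ok: bool) -> bool:
    """Return True only if every safety device has passed its pre-fill ground-level test."""
    return prv_ok and pressure_sensor_ok and high_level_alarm_ok

if allow_fill(prv_ok=True, pressure_sensor_ok=True, high_level_alarm_ok=False):
    print("Open silo inlet valve - delivery may commence")
else:
    print("Inlet valve stays closed - investigate the failed check before filling")
```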

As an added benefit, an effective protection system can serve as a powerful predictive maintenance diagnostic tool by recording critical near-miss events that occur during the filling process. This information allows managers to carry out effective predictive maintenance by means of a logical step-by-step root cause analysis (RCA) process to understand why the problems are arising. For example, high pressure and PRV lift events may be due to filter problems, prompting questions such as:

• Are the filters the correct size?
• Is the filter cleaning regime fully operational?
• Have the filter bags/cartridges been changed as per the manufacturers’ recommendations?

In parallel, the logs will also indicate if the tanker drivers are routinely over-pressurising during the fill process.

In summary, the optimised silo protection system should incorporate:

• Pressure sensor, high-level alarm sensor and PRV testing (essential)
• A simple one-button press to test all components
• Silo filling auto shut-off control
• Pneumatic cleaning of the pressure sensor
• Recording of over-pressure events (time/date stamp)
• Recording of PRV lift and opening events (time/date stamp)
• Recording of high-level probe activation events (time/date stamp)
• Filter ON/OFF output option to check filter status
• Filter air supply monitoring alarm option

Conclusion

There is strong empirical evidence that many silos are ‘disasters waiting to happen’. The practical reality is that powder storage silos can split or rupture at pressures as low as 1 or 2 psi above atmospheric pressure. Malfunctioning filter housings can be ejected at similar pressures.

Cursory visual inspections of silo protection equipment are woefully inadequate. It is therefore imperative that any installed safety system is capable of providing reliable protection that can be easily verified by testing critical components before each and every delivery – without having to climb to the top of the silo. This approach will provide total silo safety, protecting the surrounding environment, assets and, most importantly, site personnel and the public.


Page 21: WIPAC Monthly - June 2017

Application Note:

Activated Sludge Plant Optimisation & Aeration Performance

Upgrading an aeration system is a significant challenge and can be a costly process. In order to ensure that an upgrade will provide the performance improvements, and hence the efficiency savings, required to meet both compliance targets and payback timescales, pilot plant trials are a rapid way to generate process-relevant data.

The treatment plant under consideration in this instance utilises jet aeration to provide the oxygen required to treat the influent, a blend of domestic and high-strength industrial sewage. In order to reduce the energy demand on the site, the utility are investigating the potential to replace the jet aeration with a fine bubble diffused aeration (FBDA) system. Though the industrial flows in question are small, there is some concern that they may contain substances that could inhibit oxygen transfer, thus limiting the efficiency improvements offered.

Testing the industrial inputs to the site for these components would be a lengthy and expensive process as there is a broad range of potential candidate chemicals. As such, a much more straightforward method has been used: design and installation of a pilot plant where any effects can be measured directly during the process.

The pilot plant consisted of a 6m high, 2m diameter tank with 5.6m water depth, installed adjacent to the existing sequential batch reactor (SBR) under investigation. The tank was fitted with FBDA diffusers at the same grid density as the proposed aeration system and a blower capable of providing a variable air flow rate proportional to the design air flow for the system. The pilot plant received Mixed Liquor Suspended Solids (MLSS) pumped from the SBR and aerated it to measure the actual oxygen transfer rate using the off-gas measurements of oxygen. Secondary calculations for rate of increase in dissolved oxygen (DO) were also performed.

For the pilot test to be representative, the MLSS concentration had to be the same as the main SBR. Incoming flows from the SBR displaced the volume within the pilot plant, entering at the bottom of the column with an overflow outlet at the top. To ensure that no build-up of MLSS occurred within the pilot plant, the pump only operated during the react period of the SBR, with the pilot plant only aerating when the pump was operating. This ensured the tank contents were completely mixed and the overflow MLSS was at the same concentration as the main pilot tank volume, thus maintaining the same MLSS concentration as the main SBR.

In addition to the review of the FBDA process in this application, the above trial also served as a performance test of the DO sensor itself. Three optical dissolved oxygen sensors were suspended at depths of 1.5m, 3m and 4.5m respectively within the reactor chamber. To enable remote monitoring of performance, the data was sent to a remote cloud server via a Storm modem unit housed in the local monitor cabinet.

This trial ran over the course of 7 weeks, taking in phases of operation at a number of air flow rates and feed pump cycle times. The feed pump and blower only operated when the main site SBR was in the following modes:

• Anoxic static fill
• Anoxic mixed fill
• Aerated fill
• React period

Figure 1: Pilot plant with DO positions at 1.5m, 3m & 4.5m (inset)

Figure 2: FBDA rig used in test
Figure 3: Control & Monitoring Station
Figure 4: Instrument data (both DO and temperature) is clearly seen on the monitor display


Page 22: WIPAC Monthly - June 2017

Accurate DO measurements to enable aeration system upgrade

In order to ascertain any performance effects caused by the industrial component of the influent, accurate and reliable dissolved oxygen measurements are critical, as these measurements form the basis of the mass balance equations used to establish the components of the Alpha factor.

Critical to the design of activated sludge plants, the Alpha factor is the ratio between the mass transfer coefficient kLaO2 in the wastewater and kLaO2 in clean water. The Alpha factor varies depending on the aeration process, tank geometry and water characteristics, in particular the mixed liquor suspended solids (MLSS) concentration.

In general, aeration processes which are mixing energy intensive, such as jet aeration, have higher Alpha factors than those found in passive aeration processes, such as fine bubble diffusion. The calculation for the Alpha factor requires the rearrangement of the equation for the Actual Oxygen Uptake Rate (AOR) as shown below:

AOR = α × SOR × (βCs − Co)/Cs20 × 1.024^(T−20)

Therefore:

α = AOR / [SOR × (βCs − Co)/Cs20 × 1.024^(T−20)]

Where:

AOR = Actual Oxygen Uptake Rate, estimated through mass balance calculations
SOR = Standard Oxygen Rate, the performance of the aeration system at the given air flow rate
β = 0.95
Cs = DO saturation concentration at site temperature and altitude, 10.5 mg/l
T = temperature, 13.5°C
Co = operating DO concentration
Cs20 = DO saturation concentration in clean water at 20°C
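
As a check on the arithmetic, the short sketch below evaluates the rearranged equation using the site constants quoted above; the AOR, SOR and operating DO figures are placeholders rather than results from the trial, and the clean-water Cs20 value is an assumption.

```python
def alpha_factor(aor, sor, cs=10.5, co=2.0, cs20=9.09, beta=0.95, temp=13.5):
    """alpha = AOR / [SOR * (beta*Cs - Co)/Cs20 * 1.024**(T - 20)]"""
    return aor / (sor * (beta * cs - co) / cs20 * 1.024 ** (temp - 20.0))

# Illustrative oxygen rates in kgO2/h; Cs20 of 9.09 mg/l is an assumed clean-water value
print(f"alpha = {alpha_factor(aor=55.0, sor=120.0):.2f}")
```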

Focussing on the little things – innovation in optical DO

Optical DO is nothing new to the wastewater market, so true differentiation between products can be difficult. A review of fundamental sensor performance has enabled innovative improvements to be made by WTW.

Optical DO sensors classically have a membrane mounted flush to the bottom of a flat cap. This means that the sensor collects bubbles and sludge if suspended vertically. Mounting from a pole fixed at 45° can help this issue but greatly reduces the ability of the sensor to oscillate in the flow, thus removing a critical aspect of self-cleaning. A simple and elegant solution to this has been to mount the membrane itself at a 45° angle from the base. The sensor can then be suspended vertically from a chain enabling it to oscillate in the flow and enhancing the self-clean by the movement of the bubbles upwards across the membrane surface.

Oxygen measurements are made via a fluorescent dye impregnated into the sensor membrane. The amount of fluorescence which occurs under excitation is dependent on the level of oxygen present. Sensors in the market typically use a blue light to excite the dyes and cause the fluorescence. However, this blue light bleaches the membrane, limiting the effective lifetime of the membrane cap and thus reducing the service interval of the sensor. By moving up the visible spectrum to green light, WTW have minimized the bleaching effect without impacting sensor response and accuracy. This simple change increases the operational lifetime of the sensor caps thus reducing the operational cost of the probe.

Data Collection & Sensor performance

The data generated by the three DO sensors was sent out via a Storm-3 modem to the Storm Central cloud data network. Access details to this site location and data were provided to the project team within Xylem alongside specified personnel within the client organization enabling them to track performance and greatly increasing their buy-in to the project.

This data can be viewed directly in the GUI or downloaded for further investigation. The Storm package is offered as standard for all wastewater demonstration and trial units from Xylem Analytics. An example of the data generated from the DO sensors is shown below as represented on the Storm remote cloud server.

Figure 5: The membrane mounted at 45° enhances self-cleaning


Page 23: WIPAC Monthly - June 2017

As can be seen, the 3 sensors tracked each other very well during the plant operational cycles with very little noise on any of the signals. Due to their enhanced self-cleaning ability, no maintenance interventions on the sensors were required during the course of the 7 week trial.

Clean Data Is Crucial

Optimisation and control of unit operations within industrial and municipal treatment processes should always be thought about with continuous improvement in mind. Clean data is crucial for optimal process performance and control. It should always be carefully monitored and validated to ensure accuracy, repeatability and reliability.

Monitoring dissolved oxygen levels within an aeration process accurately and reliably is key to meeting energy budgets and treatment targets. It is important to consider maintenance when selecting instruments to perform this task, particularly in environments where heavy fouling is prevalent. Using an incredibly accurate instrument is ideal, but if the chosen location is prone to rapid fouling then the instrument will not cope and will need constant supervision. Results can become unreliable, the control inaccurate and optimisation impossible.

Figure 6: Data from the Storm platform showing that the 3 dissolved oxygen probes installed tracked each other very well during plant operational cycles and an example of the storm remote cloud server

Loughborough University receives a £1.7 million grant for flood prediction

Loughborough, Leics. (9th June 2017) – Loughborough University is proud to announce it has been given a £1,700,000 grant from the Engineering and Physical Sciences Research Council. These funds will support research into the Emergency Water Information Network (EWIN).

Teams from the UK and Mexico are working on ways to monitor flooding using phone systems. Minutes of warning are important, and modern flood warning systems are critical for a country to protect its people and develop and grow its towns and cities. Flooding is the most severe natural disaster humanity has to cope with in terms of loss of life, and the long-term effects of flooding have severely adverse social consequences. Unlike the UK, which has a good flood defence system, many countries have not been able to afford the technology and have a poor response to flooding. However, many developing countries do now have access to modern cellular phone networks in their towns and cities. So in this project, we will discover how to use mobile phone networks combined with WiFi to help countries tackle flood risk.

“Floods are unstoppable so giving people time to get away is critical. Assisting developing countries to spot the signs using their phone networks will make a big difference to communities in Mexico who are at flood risk”

The partners in this research are Loughborough University, the University of Mexico, the University of Colima, Dynamic Flow Technologies Limited and Siteldi.


Page 24: WIPAC Monthly - June 2017

July 2017

Low Cost Sensors
5th July 2017
University of Southampton, UK
Hosted by the Sensors for Water Interest Group

September 2017

Sensing in Water 2017
27th - 28th September 2017
Nottingham Belfry, Nottingham, UK
Hosted by the Sensors for Water Interest Group

October/ November 2017

Wetsus Congress
9th - 10th October 2017
Leeuwarden, The Netherlands
Hosted by Wetsus

Innovation Brokerage Workshop
9th - 10th October 2017
University of Bath, UK
Hosted by the Sensors for Water Interest Group

Hach Webinar Series

In 2017 Hach will be hosting a series of webinars on key issues in the Water Industry. These webinars started on 23rd May 2017 with their first webinar on Achieving Low Level P Limits in Wastewater Discharges, hosted by Stuart Ainsworth, Application Development Manager.

The remainder of the series will cover topics on

• Oxidation ditches: their control and the impact of Alkalinity

• Integration of control systems within wastewater treatment plants

• Swing zone control for Activated Sludge Plants

• DAF optimisation and Industrial process control

• TOC control for Anaerobic Digestors


Conferences, Events, Seminars & Studies

Conferences, Seminars & Events

Low Cost Sensors

Where: Southampton University
When: 5th July 2017

The increasing interest in the Internet of Things and the drive for ‘sensors everywhere’ has resulted in an increased requirement for low-cost sensors. This is beginning to become a reality, partly due to increased global competition in the sensor supply market. Although sensor cost reductions in the water and environmental monitoring sector have been achieved in recent years, it is debatable whether these have been significant enough to affect the number of sensors fitted or deployed.

This SWIG Low Cost Sensor workshop will address where the Water Utility and Environmental Monitoring sectors currently stand in the use of low-cost sensors, and will provide some case studies demonstrating how water and catchment management have benefited so far from advances in this area. Examples from other industry sectors will also be presented to provide an insight into benefits gained within these sectors, and to inform how these can be translated into water monitoring requirements. Within this theme, the workshop will also include presentations from academics working on novel inexpensive sensors to provide some foresight into sensor systems for the (near) future.

Sensing in Water

Where: Nottingham Belfry, Nottingham
When: 27th - 28th September 2017

The 4th biennial conference and exhibition promises to be bigger and better than ever! In 2015, 180 people attended Sensing in Water over the 2 days of the conference, including 12 major water companies. 40 exhibitors presented their products and services in the exhibition, many of whom are repeat customers at our conference, so don’t miss the opportunity to advertise and raise your profile to the water sensor community!

The theme of this year’s conference is “Meaningful Measurement from Micro to Macro”, covering the application of sensor technology, with the sessions over the two-day conference concentrating on:

• Sensor Design & Performance
• Sensor application at the Treatment Works
• Sensor application within networks & infrastructure
• Sensor application within environmental catchments

With keynote speakers from Welsh Water and the Environment Agency, and over 20 speakers in total, the event promises to be an interesting few days. The latest draft programme is available here


Page 25: WIPAC Monthly - June 2017

Sensing in Water 2017

“Meaningful Measurement from Micro to Macro”

27th - 28th September 2017
Nottingham Belfry, UK
