


Unified Data Architecture

TDWI e-book, May 2013

Exploring Teradata's Unified Data Architecture

Q&A: What Is a Unified Data Architecture?

What Big Data Is Really About

Q&A: How Strategic Planning Can Help Manage Big Data

About Teradata

Sponsored by

tdwi.org


Exploring Teradata's Unified Data Architecture

Big data doesn’t break with existing best practices, concepts, or methodologies; it involves, instead, a slight rethinking of the status quo.

Big data isn’t a revolutionary change, says Imad Birouty, program marketing manager with Teradata Corp.

Birouty and Teradata see big data as an evolutionary development. Big data doesn't break with existing best practices, concepts, or methodologies, Birouty contends; it involves, instead, a slight rethinking of the status quo. Teradata's Unified Data Architecture (UDA) is designed to extend Teradata's platform vision—which is anchored by the enterprise data warehouse (EDW)—to encompass big data systems.

Unified Data Architecture recasts how information gets ingested, managed, circulated, and persisted. It’s designed to accommodate new and emerging information use cases—including established practices, such as reporting, dashboards, and scorecards; business analytics, analytic discovery, and advanced analytics; and (of course) big data analytics.

“Our goal with the Unified Data Architecture is to make it easier for our customers to work with these [new] technologies in a consistent and governable manner,” he explains.

“That’s why we introduced our Aster Discovery appliance, that’s why we brought Hadoop into our stack, and that’s why we’ve continued to enhance the Teradata database, with [technologies such as] our new Intelligent Memory option,” Birouty continues.

It isn’t just about platforms, Birouty says. “There’s a lot of engineering muscle behind this. We are porting all of our technologies to [UDA]. That’s 30 years of knowledge and development and tools and utilities—all the things that our customers use in their data warehouse environments. We’re extending all of that technology to Aster and Hadoop. We’re tying these technologies together; we’re making them work seamlessly together.”

During the first wave of big data hype, it became fashionable to bash the EDW as a vestige of the past, but Birouty and Teradata believe the pendulum has now swung back toward the EDW. Birouty maintains that the EDW is a mature, efficient, and scalable platform for managing core business data. By EDW, Birouty and Teradata mean just that: an enterprise data warehouse; a single repository for business information.

Teradata’s isn’t an outlying view. Philip Russom, director of data management (DM) for TDWI Research, has argued that

“Enterprise data warehouses [are] not going away; they’re still killer platforms for the things they’re designed for, [such as] standard reports.” More to the point, Russom maintains, the EDW addresses fully 95 percent of business intelligence (BI) and decision support system (DSS) use cases, such as dashboards and scorecards; multidimensional reporting, via OLAP-driven discovery; and “the kinds of metrics and key performance indicators that you need for performance management.”

This isn’t likely to change, Russom indicates. “The list I just gave you is 95 percent of BI deliverables,” he concludes. “Why would you kill the thing that’s 95 percent of your job?”

Teradata’s UDA Expert Q&A Big Data Strategic Planning About Teradata


Analytic Heterogeneity

On the other hand, there’s a lot of buzz about the other five percent of Russom’s calculus.

UDA was conceived to address the changing requirements of analytics in the enterprise. Consider the analytic “sandbox” use case, which is closely related to analytic discovery. Discovery of any kind poses significant problems in a conventional DW environment. For example, how can data management (DM) practitioners accommodate the needs of business analysts and data scientists, who in many cases require access to raw, inconsistent, or unprepared data? How do insights get circulated—or recirculated—back to the EDW?

Teradata’s Unified Data Architecture addresses this problem in a couple of ways.

For the “sandbox” use case, UDA makes use of Teradata’s DW platform. “We have a tool called Teradata Data Lab that allows users to create this sandbox area within a Teradata system. It’s sitting inside a production data warehouse, but it’s sectioned off and assigned workload management priorities that don’t interfere with production workloads,” Birouty explains. “Think of it as this play area; all the experiments you’re doing and all of this experimental data are sitting in a production system, and you can join it with all the other production data. This provides a path [inside Teradata Warehouse] to then promote new findings into normal production processes.”

Unified Data Architecture uses Aster for analytic discovery, as well as for big data analytic scenarios. This includes both "extreme SQL" analytics and advanced (non-SQL) analytics on unstructured or semi-structured data. "The Teradata database is a traditional relational database, but Aster brings in not only [in-database] MapReduce, but a graph engine and nPath Analysis; they bring a lot of these non-RDBMS-type functions to the table," Birouty says.

“Not only does Aster have … up to 80 prebuilt functions … [but] companies can custom write their own. So if you’re a Java developer, you can write pretty much anything you want, and take advantage of the Aster engine and the parallelism. You can take what you just wrote in Java and give that back to your end users as a SQL function.”
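The pattern Birouty describes—a custom function that the engine applies to each data partition in parallel—can be sketched in a few lines. This is Python rather than Aster's Java/SQL interface, and the `sessionize` function, gap threshold, and customer partitions are all invented for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def sessionize(rows, gap=30):
    """User-defined logic: count sessions, starting a new one when the
    time gap between consecutive events exceeds `gap`."""
    sessions, last = 0, None
    for ts in sorted(rows):
        if last is None or ts - last > gap:
            sessions += 1
        last = ts
    return sessions

# Rows pre-partitioned by customer, the way a parallel engine shards data.
partitions = {
    "cust_a": [0, 10, 200, 215],
    "cust_b": [5, 300],
}

# Each partition is processed independently; the engine would fan these
# out across nodes, then make the function callable from SQL.
with ThreadPoolExecutor() as pool:
    results = dict(zip(partitions, pool.map(sessionize, partitions.values())))

print(results)  # {'cust_a': 2, 'cust_b': 2}
```

The key property is that the user writes only the per-partition logic; the framework supplies the parallelism, which is what makes the custom function usable "as a SQL function" at scale.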

Unified Data Architecture makes use of Hadoop in several ways. One involves using it as a kind of online queryable archive; aged, “cold,” or less-critical data can be shifted to Hadoop, which provides a cheaper storage layer than fit-for-purpose relational platforms. Interactive access is via SQL-H, Teradata’s implementation of a SQL-like query facility for HCatalog. (HCatalog provides metadata catalog services for Hadoop). Queryable archive is one compelling use case for Hadoop; an even more popular one, Birouty argues, is as a staging area for new data. Hadoop is a schema-free platform, which makes it ideal for storing unstructured information; however, it can also be used as a repository for structured data.
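The queryable-archive use case boils down to temperature-based tiering: rows past an age threshold move to the cheaper storage layer but remain reachable at query time. A minimal sketch, with an invented one-year threshold and made-up tier names:

```python
from datetime import date, timedelta

# Illustrative policy: anything older than a year moves to the cheap tier.
ARCHIVE_AFTER = timedelta(days=365)

def assign_tier(row_date, today):
    """Cold rows go to the Hadoop archive tier; hot rows stay on the warehouse."""
    return "hadoop_archive" if today - row_date > ARCHIVE_AFTER else "warehouse"

today = date(2013, 5, 1)
rows = {
    "order_1001": date(2013, 4, 2),    # recent: stays hot
    "order_0007": date(2011, 12, 25),  # cold: archived but still queryable
}
tiers = {k: assign_tier(d, today) for k, d in rows.items()}
print(tiers)
```

In the architecture described above, a SQL-H query would then reach the archived rows through HCatalog's metadata, so the move is transparent to the analyst.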

A Hadoop-based landing zone can also make it easier—and more affordable—to store raw data. “You can bring your data in [to Hadoop] before you even do the ELT. In this way, you can … afford to keep the raw data, which is important, because typically you’re only bringing parts of your data into the warehouse,” he says. “Some of [your analysts] might want all the data, and you’re just scratching your head, saying ‘I wish this table or customer address were included.’ There’s always things that you don’t bring in.”
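The landing-zone pattern Birouty describes can be sketched directly: the full raw record lands untouched, and only the subset of fields the warehouse model keeps is extracted for loading. The field names and record below are invented for illustration.

```python
raw_landing_zone = []   # stand-in for cheap, schema-free Hadoop storage
warehouse_rows = []     # stand-in for the curated warehouse table

# The subset of fields the warehouse model was designed to keep.
WAREHOUSE_FIELDS = ["customer_id", "amount"]

def land_and_load(record):
    raw_landing_zone.append(record)  # full record retained, nothing discarded
    warehouse_rows.append({f: record[f] for f in WAREHOUSE_FIELDS})

land_and_load({"customer_id": 7, "amount": 19.99,
               "customer_address": "12 Main St", "raw_log": "GET /checkout"})

# Later, when an analyst wishes customer address had been included,
# the raw copy still has it:
print(raw_landing_zone[0]["customer_address"])  # 12 Main St
```

This is the payoff of landing before ELT: the warehouse stays lean, but "the things you don't bring in" are never lost.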

All of this is to say that Unified Data Architecture is much more than the sum of its data or query processing parts, Birouty contends. “Our minds tend to go directly to the big technologies, to the Teradata [database], to Aster, to Hadoop—and that’s good, because that’s where the heavy lifting happens—but the magic of the Unified Data Architecture is all the stuff that glues it together,” he says.

“It’s Teradata Viewpoint that allows you to manage the multiple [constitutive] systems; it’s Teradata Unity that routes requests between Teradata systems; it’s Smart Loader for Hadoop that enables you to drag and drop data from Hadoop into Teradata; it’s SQL-H that enables Teradata warehouses or Aster queries to reach into Hadoop at runtime; it’s one-stop support for the entire environment. All of these different technologies that we bring to the table glue together Teradata [Warehouse], Aster, and Hadoop as a cohesive architecture.”

Teradata’s UDA Expert Q&A Big Data Strategic Planning About Teradata

Page 4: Unified Data Architecture

3 TDWI e-book UnIfIeD DaTa archITecTUre

Q&A: What Is a Unified Data Architecture?

We explore big data—including its challenges, opportunities, and the common mistakes analysts and data managers make using it—as well as the role a unified data architecture (UDA) plays in this technology environment with Steve Wooledge from the Teradata Unified Data Architecture product group.

TDWI: What challenges (or opportunities) does the big data movement bring to organizations and their data architectures?

Steve Wooledge: The big data movement has helped create excitement about the value that a data-driven culture and new analytic technology can bring to an organization. In many ways, big data is a discussion about what's possible, and challenges the norms of business intelligence and analytics from the past. This elevates the status of technology and the role of the IT group to have a seat at the table for business strategy, and allows the business to say, "What could we do for our customers if we had better insight across all the data available to us, regardless of scale or the complexity of analytics required? Could we better predict machine failures, forecast demand in our supply chain, or personalize the experience of customers engaging with our company through a variety of interaction points?"

This is the opportunity. On the other hand, the challenge this creates is to prioritize the requests with the business and identify a project that solves a business problem and doesn't spin up new "big data" technology just to say you can (build it and they will come). There are challenges around staffing, data governance, privacy, IT standards, and providing a self-service environment to the business users and applications.

Teradata’s UDA Expert Q&A Big Data Strategic Planning About Teradata

Page 5: Unified Data Architecture

4 TDWI e-book UnIfIeD DaTa archITecTUre

Take the challenge of staffing. Companies want to know: Do I go out and hire a data scientist to help me figure this out, or is there a way to extend the value of my existing analyst skill sets and get them access to new data and analytic techniques with some extensions to our existing data architecture? This kind of self-service empowerment allows users to answer their questions and discover new insights that can impact the bottom line without needing to hire specialists or retrain existing analysts.

Although there is hope and excitement around big data, there is no silver bullet that will suddenly make data management and analysis in the enterprise easy. Having reliable information that is consistent, accurate, and secure is important when you begin to operationalize new insights that innovations in big data analytics provide.

What are some of the common mistakes that data management or analytics professionals make when it comes to big data and gaining new insights from data?

Very often we see companies putting the cart before the horse. What I mean by this is that organizations have a technology in mind without yet having clear objectives on the business problem they are trying to solve. As technologists, we get excited by the latest innovations in our industry, but it doesn’t mean we can just forget the enterprise architecture and data governance requirements in our organization.

For example, we get requests from companies that want to learn about "big data" technologies. Our first question to them as advisors is, "How do you define big data? What are you trying to do?" Every company defines it differently. For some, big data is large volume, which Teradata has handled for decades. Other companies think big data is anything to do with social media or Web interactions with customers—data that is large in volume as well as non-relational, in the form of Web logs or other machine data. Some companies simply have a mandate from their CIO to put a Hadoop project in place, but when we ask them what issues they're trying to address, there is rarely a well-defined problem behind it. It's healthy that businesses want to challenge the status quo, but companies need to think through the implications of incorporating new technology across the business and architect their data and analytic infrastructure for the future end state.

What best practices do you recommend for overcoming these mistakes?

We recommend the same approach as for any successful data management or analytics project: working with an internal competency center or engaging an external trusted advisor to align technology capabilities with the business requirements. Start with the business problem or question that needs to be answered, determine what data is required and what the analytical process is, identify the staff and skills required, and then choose the right analytical tool or technology for the job. There has never been more innovation in the analytics space than what we are seeing now from a technology perspective, but it's important to understand the "sweet spot" of each technology component and understand how to integrate it into an information architecture that can meet service-level agreements (SLAs) with the business.

What is a unified data architecture?

A unified data architecture is a more comprehensive view of the overall enterprise architecture. It is a collection of services, platforms, applications, and tools that help customers define and deploy an architecture that makes optimal use of available technologies to unlock the full value of data. Gone are the days when an enterprise stores, transforms, manages, and analyzes all data in one box. Just as a carpenter has many tools to select from, a unified data architecture offers several options so you can use the right tool for the job.

A unified data architecture should contain a data warehouse (one that has integrated and shared data environments to manage the business and deliver strategic and operational analytics to the extended organization), discovery analytics (to rapidly unlock insights from big data with rapid exploration using a variety of analytic techniques made accessible by mainstream business analysts), and data staging (loading, storing, and refining data before it’s analyzed).
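The "right tool for the job" idea above can be sketched as a simple routing table from workload class to the tier described in the answer. The workload names and mapping are a simplification invented for illustration, not a product feature.

```python
# Illustrative mapping of workload classes to the three tiers named above:
# integrated warehouse, discovery analytics, and data staging.
TIER_FOR = {
    "operational_report": "data_warehouse",
    "dashboard": "data_warehouse",
    "pattern_discovery": "discovery_platform",
    "raw_ingest": "data_staging",
}

def route(workload):
    """Send each workload to its fit-for-purpose tier; default to the warehouse."""
    return TIER_FOR.get(workload, "data_warehouse")

print(route("pattern_discovery"))  # discovery_platform
```

The design point is that the tiers are complementary, not competing: routing by workload is what makes the collection of platforms behave as one architecture.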

What are the benefits of such an architecture?

When organizations put all of their data to work, they make smarter decisions and create a new, data-driven approach to improving their business. Through deeper insights about customers and operations, data delivers a competitive advantage when an organization can leverage analytics on all of its data. The opportunity to benefit from data is greater than ever before, but so are the challenges of figuring out the best way to enable this capability.

What is Teradata Unified Data Architecture?

It is a new, data-driven approach to increase productivity, lower operating costs, and gain real competitive advantage. It’s a trusted and cost-effective framework for smarter data management, processing, and analytics that enables organizations to exploit all of their data. Teradata Unified Data Architecture is the only truly integrated analytics solution that unifies multiple technologies into a cohesive and transparent architecture—leveraging best-of-breed technologies to help companies create competitive advantage with their data.

What are the benefits of Teradata Unified Data Architecture?

Three benefits are key. First, the solution is truly integrated. Unlike alternatives, the Teradata Unified Data Architecture protects valuable data and aligns the best technology to the specific analytic need. We don’t run around with a hammer thinking every problem looks like a nail. Second, enterprises can access any and all data. Business analysts and other knowledge workers or end users can ask any question of any data at any time, so users are empowered to do more across the enterprise by simply and quickly unlocking new and valuable insights. Finally, it’s trusted by such industry leaders as Coca-Cola, eBay, and Ford.

How would a customer use an entire unified data architecture?

A large global bank was struggling to reduce churn in profitable customer segments. Part of the problem the bank had in tackling this issue was integrating customer interaction data across multiple channels from numerous siloed repositories. The size of the data—billions of records per month—also made the analysis of this information a very complex exercise.

The bank wanted to proactively detect and prevent churn among profitable customers and chose Teradata Unified Data Architecture; they built an enterprise view of all customer interactions with the bank and identified the most frequent paths to account closure across all interaction channels. As a result, the bank reduced customer churn among profitable customers by five percent simply by identifying and then removing the event that was causing a high number of account closures.

They used Teradata’s integrated data warehouse (for historical customer transaction, profile, and product information), the Teradata Aster discovery platform (to analyze and discover customer interaction patterns over time to determine which actions were most likely to lead to an account closure), Hadoop (for loading, storing, and refining long histories of data), and Teradata Integrated Marketing Management (to make real-time decisions and offers to enhance customer satisfaction and deliver the right offers at the right time, preventing account closures and actually growing the customer relationship).

Does this mean Teradata is now advocating the sale and use of Hadoop?

Yes, but only for the right use case. The hype around Hadoop would have you believe that it will solve every problem in data management, which simply is not true. Our goal is to provide best-in-class technology and help our customers architect the most cost-effective solutions to support the business without sacrificing SLAs or data governance. Sometimes the solution includes Hadoop; other times it is not required, just as an integrated data warehouse is not ideally suited for every analytical workload.

In the Teradata Unified Data Architecture, we recommend Hadoop for capturing, storing, and refining large volumes of multi-structured data. We see the value Hadoop brings and wrap it with value-add enabling software and services for system management, monitoring, and support to help customers get the most value out of it as part of our Unified Data Architecture.

Teradata’s UDA Expert Q&A Big Data Strategic Planning About Teradata

Page 7: Unified Data Architecture

6 TDWI e-book UnIfIeD DaTa archITecTUre

What Big Data Is Really About
By Mark Madsen

Ignore the hype surrounding big data. What's really important is to learn about the new models for data processing that big data is bringing so you can plan rather than react.

Big data isn't hype, but it is being hyped. There is substance to the technology shift happening in the broader data management market, of which business intelligence and big data are both a part. The real question to ask is, "What's different?"

The constant drone of the "three Vs of big data" we keep hearing in the media doesn't explain much. This focuses on a literal interpretation of big data, explaining it in terms of bigness (except when the data isn't big), variety (except when there isn't any variety in the data), and velocity (except when the data is processed in batch). So you can have big data without any of the three Vs, making this an empty definition.

Big data implies big, but is it? Many people are using the technologies to process moderate volumes of data, perhaps in the same way we use ETL. "Big" isn't necessarily a euphemism for unstructured data, either. Much of that data is structured. It might be log files, but those logs are events, generally easy to map to a relational table. It may be text, variably structured data, or the simple rows and columns we're used to.

The term implies that the shift is about data, but it's equally about technology. One assumption is that big data technology equals Hadoop. It's more than Hadoop. There are real-time data stores, processing and analytics engines, and streaming technologies for monitoring and processing data as it flows. Some are built on top of Hadoop or HDFS while others exist independently. What they usually share is an ability to be deployed in dynamically scalable configurations.

The reality is that big data is about new models for data processing. It isn't some specific type of data, or huge volumes of data, or specific technology. It's about applying new technologies to meet unfulfilled needs that (usually) can't be met by the traditional data warehouse architecture.

Teradata’s UDA Expert Q&A Big Data Strategic Planning About Teradata

Page 8: Unified Data Architecture

7 TDWI e-book UnIfIeD DaTa archITecTUre

The areas where a data warehouse has difficulty are analytic processing, some types of data processing and transformation, and timeliness of development.

Some analytic processing is possible in SQL. Because of this, many database and analytic tool vendors say there’s no need to change, or the answer is to use a more scalable database. Depending on the scale, the types of data, the user concurrency, and the algorithm, this may be true. It’s equally possible that one of these elements limits the use of a database, which pushes the data warehouse to the side, as the source of the data that has to be moved, transformed, and processed elsewhere.

There are areas of basic data processing that the data warehouse technology stack has trouble with. This is less a failure of the database than of the data integration tools and the architecture. At large scale, processing becomes slow or expensive (or both). If the data is text or has a complex data structure, the DI tools may be poorly suited to the work. We end up in a situation where both the processing and the storage can be a mismatch to the tools we have available.

A constraint on the data warehouse today is its lack of agility: its slow response to change, and to the need for rapidly cycling models and uses, as with much exploratory or experimental analytics.

“Experimental” isn’t restricted to scientific processing. A/B or multivariate testing of landing pages on a website is a form of experimentation. So are test marketing campaigns and staged product launches. It’s a style of using information that means the data, models, and integrations need to be changed and updated rapidly—something the data warehouse was not designed for. It was designed for predictable, or at least bounded, uses that didn’t change significantly.

This is a constraint that is baked into the architecture. Models are static in the database, and both data integration and BI tools maintain mappings to the static database model. Each layer in the architecture is managed independently, usually by different people. The capabilities in one layer are generally not available to tools in the other layers. Any change requires coordination from top to bottom. All of this, and the need to model in advance, constrain the speed with which a data warehouse can react to change.

A data warehouse is good for storing the important data, and for delivery of information when the usage model is interactive query, as with BI tools and dashboards. It’s less well suited to the read-write activities of exploratory analysis, the data-intensive processing of analytic models, and high-volume, low-latency, real-time workloads.

These are the usage models that various big data technologies were designed to address. They are designed more for these unmet needs than they are for the conventional workloads of BI. Because of this, they lack key features such as robust query support and good data management, to the point of sometimes lacking concepts such as metadata and schemas that are taken for granted in the data warehouse world. The features are missing because they aren’t as important for non-BI uses.

Big data is about data processing and new usage models. As an architect or designer, it’s helpful to look at what’s different about the technologies available, the data being processed, and the uses. Big data isn’t a replacement for data warehousing, nor is it an island to be maintained separately. It’s part of the new IT environment in the same way data warehouses and BI were a new addition to the IT environment when there was only OLTP and batch reporting. We’re still in the early stages of the market, making today the time to learn about the changes that are coming. Otherwise, you’ll be reacting to changes instead of planning for them.

Mark Madsen is president of Third Nature, a technology consulting and market research firm focused on business intelligence, data integration, and data management. Mark is an award-winning architect and former CTO whose work has been featured in numerous industry publications. He is a principal author of Clickstream Data Warehousing (John Wiley & Sons, 2002) and frequently speaks at conferences and writes about business intelligence and emerging technology.

Teradata’s UDA Expert Q&A Big Data Strategic Planning About Teradata

Page 9: Unified Data Architecture

8 TDWI e-book UnIfIeD DaTa archITecTUre

How Strategic Planning Can Help Manage Big Data
By Linda Briggs

Are organizations prepared to pilot new techniques around big data, then absorb those methods back into the existing infrastructure? In this interview, BI and analytics expert David Loshin discusses ways that enterprises can successfully adjust to big data—or any new tool or technology.

Loshin is president of Knowledge Integrity and a recognized expert in information management, information quality, business intelligence, and metadata management—topics about which he writes and speaks frequently. His latest book, The Practitioner's Guide to Data Quality Improvement, looks at practical aspects of data quality management and control.

BI This Week: Are we as an industry rushing too quickly to embrace big data as a new source of information?

David Loshin: I am not sure that I would say that we are rushing to embrace big data. Rather, I might suggest that what we are classifying as "big data" analytics and management today reflects evolutionary steps applied to methods and techniques that have been in development over a relatively long period of time. There is at least a 20- to 25-year track record of some organizations taking advantage of parallel and distributed computing. It's only now, with commodity hardware and software tools generally available at a low barrier to entry, that what we are calling "big data" has become attractive and appealing.

What I question is whether organizations are properly prepared to pilot and try out new techniques, and then absorb those methods back into the existing infrastructure. It's one thing to download Hadoop, build the environment, and even load the data or run some analyses. It's another thing to be able to leverage the existing business intelligence and analytics architectures and frameworks and meld the old with the new to enable business customers to exploit actionable knowledge.

Teradata’s UDA Expert Q&A Big Data Strategic Planning About Teradata

Page 10: Unified Data Architecture

9 TDWI e-book UnIfIeD DaTa archITecTUre

Why do we need new types of analysis for big data instead of what we’re already using?

Interestingly, a large part of what is being done in the big data world is not really new; organizations using these technologies are often looking to improve the performance of analyses they are already doing. However, they want to scale up—to analyze bigger data sets, add more data sources, or do their analyses faster, so these types of organizations benefit from using the operating and programming models that parallelize computation, distribute data, or both.

That being said, though, there are numerous instances of companies looking to do some new types of analyses—for example, graph analytics algorithms can answer questions that a standard relational database management system struggles with. As more companies expand their user base beyond the so-called "C-level" users into a more mixed-workload community of users, the need for scalability and performance will put pressure on IT teams to provide high-performance solutions.
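A concrete instance of the kind of question Loshin means: multi-hop reachability, such as accounts linked through chains of shared devices, which is awkward to express in plain SQL but natural for a graph engine. A toy sketch in Python, with an invented edge list:

```python
from collections import deque

# Invented edges: each pair means two accounts are directly linked.
edges = [("acct1", "acct2"), ("acct2", "acct3"), ("acct4", "acct5")]

# Build an undirected adjacency map.
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

def reachable(start):
    """Breadth-first search: every account connected to `start` by any path."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in adj.get(queue.popleft(), ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(reachable("acct1")))  # ['acct1', 'acct2', 'acct3']
```

In SQL this requires recursive joins whose depth isn't known in advance; a graph engine treats arbitrary-depth traversal as the basic operation.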

If enterprises want to introduce new tools and technologies for dealing with big data, what’s the best way to do that company wide?

A different way of asking that question would be from a more general approach—if you want to introduce any new tool or technology, how do you engineer new technology testing, assessment, and integration, as well as abandonment, as part of the general system development life cycle?

We often have a culture of conflicts—the IT teams want to embrace and try out new technologies but cannot get the business to buy in, so they resort to a “skunk works” approach off to the side, which drains resources without necessarily providing visible results. At the same time, business users frustrated with an apparent lack of innovation will bypass the standard processes and implement new tools via a “shadow IT” approach. Both approaches are prone to failure and confusion.

An entirely different method, and one that we have seen work, is the specific allocation of time and resources to evaluating and piloting new technologies. That means partnering with business users to try to solve real business problems, and it's done knowing full well that 90 percent of those pilots are destined for failure. However, when one approach in the mix proves to add value, at least potentially, there must be well-defined processes for technology mainstreaming that map the architecture and integration plan for incorporating the new.

What about issues with oversight and governance—of data, of technology, and of projects? Is the rush to embrace big data sidestepping all of that?

Of course, we want to insist on some level of governance of the process, but on the other hand, we don’t want to stifle creativity by imposing too much bureaucracy. There must be a common ground in which those engaged in speculative design and development using new technologies like big data are still accountable from a programmatic standpoint, with defined deliverables and milestones, as well as some level of assurance that data quality and usability expectations are not being ignored. At the same time, as I said before, there has to be a process for mainstreaming successful pilot projects back into the “blessed” enterprise architecture. At that point, the systems would have to be realigned into all the governance hierarchies.

What is the best way to align an analytics program with its business consumers? Too often, these sorts of programs seem to be driven by IT—why is that, and what is the correction?

This is not a new challenge. At Knowledge Integrity, the best way we’ve found to address this issue is using a holistic approach to aligning business value with technical initiatives. We work with the business partners to identify how their business processes can be enhanced using different approaches to analytics. The goal is to review the dependencies on information and expectations for mapping analytic results to increased profitability (either through increased revenues or decreased costs), reductions in risk, and enhancement to the customer experience.



If the costs associated with introducing new approaches to analytics can be justified in terms of potential lift, the business consumers will generally support the activity.

How can companies adhere to existing business objectives and budgets, which are often crafted well ahead of time, while still remaining agile and open to trying new technologies?

I wish there were an easy answer to this, since, as you say, budgets are often configured long in advance of the actual work being done, without much expectation of absorbing new technology. We have guided our clients to introduce R&D budget line items specifically for researching, evaluating, and trying out new technologies, but we also caution them to ensure that these tasks remain aligned with business needs (as per the answer to your last question), and to expect that 90 percent of what is being tested will never see the light of day.

In regard to new technologies and challenges, such as big data management or analytics, how can organizations “plan for change” in general? What mistakes do you see clients making in this area?

I would say, try to maintain a degree of flexibility that enables your infrastructure to embrace change instead of resisting it. That idea has to span people, processes, and technology, by the way. One interesting way to do this might seem somewhat counterintuitive: insist on defining, adhering to, and enforcing standards: modeling standards, process standards, data standards, and so forth. One of the biggest challenges in planning for change is figuring out what needs to change, and well-defined standards provide the baseline against which that change can be identified and assessed.

All of this refers back to some of my earlier points: an environment that is well governed will ultimately be much more adaptable, and that will enable flexibility and agility throughout the enterprise.

Linda L. Briggs writes about technology in corporate, education, and government markets for TDWI. She is based in San Diego.



teradata.com

Teradata is the world’s largest company focused on integrated data warehousing, big data analytics, and business applications. Our powerful solutions portfolio and database are the foundation on which we’ve built our leadership position in business intelligence and are designed to address any business or technology need for companies of all sizes.

Only Teradata gives you the ability to integrate your organization’s data, optimize your business processes, and accelerate new insights like never before. The power unleashed from your data brings confidence to your organization and inspires leaders to think boldly and act decisively for the best decisions possible. Learn more at teradata.com.

• www.teradata.com/uda

tdwi.org

TDWI, a division of 1105 Media, Inc., is the premier provider of in-depth, high-quality education and research in the business intelligence and data warehousing industry. TDWI is dedicated to educating business and information technology professionals about the best practices, strategies, techniques, and tools required to successfully design, build, maintain, and enhance business intelligence and data warehousing solutions. TDWI also fosters the advancement of business intelligence and data warehousing research and contributes to knowledge transfer and the professional development of its members. TDWI offers a worldwide membership program, five major educational conferences, topical educational seminars, role-based training, on-site courses, certification, solution provider partnerships, an awards program for best practices, live Webinars, resourceful publications, an in-depth research program, and a comprehensive website, tdwi.org.

© 2013 by TDWI (The Data Warehousing Institute™), a division of 1105 Media, Inc. All rights reserved. Reproductions in whole or in part are prohibited except by written permission. E-mail requests or feedback to info@tdwi.org.

Product and company names mentioned herein may be trademarks and/or registered trademarks of their respective companies.
