What are the best practices when it comes to deciding between a mainframe and NT environment for a data warehouse? We have a mainframe in house for OLTP and can use it. Are there ETL tools on a mainframe?
David Marco's Answer: There are several ETL tools available on the mainframe. Evolutionary Technologies International (ETI) has the leading tool in this environment. Informix owns a tool called Prism Warehouse Executive (PWE) that also works well on a mainframe.
Doug Hackney's Answer: The greatest challenges in the mainframe environment are cultural. A data warehouse is a relatively freewheeling environment with multiple roles - including users - needing the capability to create and drop temporary and permanent tables. Mainframe environments are also challenged to deliver sustained high performance to DW tasks at the most critical times (i.e., month-, quarter- and year-end) because BI is always the lowest priority behind OLTP jobs. If you can gain dedicated processing power and overcome the embedded cultural biases, mainframes have plenty of power to do the job. Your choices of ETL tools will be limited to first-generation code generators such as ETI and Prism/Ardent/Informix.
Chuck Kelley's Answer: In the May 1994 issue of DM Review magazine, Bill Inmon and I wrote an article describing the 12 defining characteristics of a data warehouse. The first characteristic is that they are separate but equal. In that sense, I would recommend (based on your two options, though there are more) using the NT environment instead of the mainframe that is also hosting the OLTP system. I realize there are extra cycles available on the mainframe, but how do you tune one operating system to handle both a) 5 reads and 7 writes - a typical OLTP transaction - and b) reading a million rows and aggregating them? If you are running MVS, you can run two "copies" of MVS on the same machine, but that is, in effect, the same as getting another environment.
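The workload contrast Kelley describes can be sketched in a few lines. The example below is a hypothetical illustration (the `orders` table, row counts and SQLite engine are assumptions, not anything from the original systems): an OLTP transaction touches a couple of indexed rows, while a warehouse query scans and aggregates every row - two access patterns that pull the tuning of a single shared environment in opposite directions.

```python
import sqlite3

# Hypothetical schema and data; the point is the shape of the two
# workloads, not the engine (an in-memory SQLite stands in here).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, "EAST" if i % 2 else "WEST", float(i)) for i in range(1, 10001)],
)
conn.commit()

# OLTP shape: an indexed point read plus a single-row update.
cur.execute("SELECT amount FROM orders WHERE order_id = ?", (42,))
point_read = cur.fetchone()[0]
cur.execute("UPDATE orders SET amount = amount + 1 WHERE order_id = ?", (42,))
conn.commit()

# DW shape: scan all 10,000 rows and aggregate by region.
cur.execute("SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region")
rollup = cur.fetchall()
print(point_read, rollup)
```

The first statement is satisfied from the primary-key index in a handful of page reads; the second has no choice but to touch every row. A buffer pool, scheduler and lock manager tuned for one pattern will be mis-tuned for the other, which is the heart of the separate-but-equal argument.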
Now it would be very wise to use those cycles to do the ET part of the ETL process. There are tools that run on the mainframe. The ones that come to mind are Warehouse Executive by Ardent (www.ardent.com) and Tapestry by D2K (www.d2k.com). I know that there are more.
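The split Kelley suggests - extract and transform near the OLTP source, load on the warehouse side - can be sketched as below. This is a minimal, hypothetical outline, not any vendor's tool: the record layout, cleansing rules and CSV hand-off file are all illustrative assumptions.

```python
import csv
import io

def extract(source_rows):
    """Pull raw records from the operational system (stubbed as a list here)."""
    return list(source_rows)

def transform(raw):
    """Cleanse and conform near the source: trim and uppercase region codes,
    reject rows whose amount will not parse, so only clean data is shipped."""
    out = []
    for region, amount in raw:
        region = region.strip().upper()
        try:
            amount = float(amount)
        except ValueError:
            continue  # bad row never reaches the warehouse
        out.append((region, round(amount, 2)))
    return out

def to_load_file(rows):
    """Serialize the transformed rows as CSV - the compact file the
    warehouse side would bulk-load, completing the L of ETL."""
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    return buf.getvalue()

raw = [(" east ", "10.5"), ("West", "oops"), ("EAST", "4.25")]
load_file = to_load_file(transform(extract(raw)))
print(load_file)
```

The design point is that the heavy, row-by-row work (extract and transform) runs where the spare cycles and the source data are, and only a small, cleansed load file crosses over to the NT warehouse.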
Sid Adelman's Answer: There are more tools available for NT than for the mainframe, and these tools have usually had more field exposure and testing. Tools for NT will most likely be less expensive, but the mainframe is far more scalable, has better availability characteristics and has better monitoring tools. Be sure to look at the skills available to you. This could be the deciding factor.
David Marco is an internationally recognized expert in the fields of enterprise architecture, data warehousing and business intelligence and is the world's foremost authority on meta data. He is the author of Universal Meta Data Models (Wiley, 2004) and Building and Managing the Meta Data Repository: A Full Life-Cycle Guide (Wiley, 2000). Marco has taught at the University of Chicago and DePaul University, and in 2004 he was selected to the prestigious Crain's Chicago Business "Top 40 Under 40." He is the founder and president of Enterprise Warehousing Solutions, Inc., a GSA schedule and Chicago-headquartered strategic partner and systems integrator dedicated to providing companies and large government agencies with best-in-class business intelligence solutions using data warehousing and meta data repository technologies. He may be reached at (866) EWS-1100 or via e-mail at DMarco@EWSolutions.com.
Douglas Hackney is the president of Enterprise Group Ltd., a consulting and knowledge-transfer company specializing in designing and implementing data warehouses and associated information delivery systems. He can be reached at www.egltd.com.
Chuck Kelley is a senior architect in the business intelligence practice for Hitachi Consulting (www.HitachiConsulting.com), a globally recognized leader in delivering value-based business and IT Solutions. Kelley is an internationally known expert in database and data warehousing technology. He has 30 years of experience in designing and implementing operational/production systems and data warehouses. Kelley has worked in some facet of the design and implementation phase of more than 50 data warehouses and data marts. He also teaches seminars, co-authored three books on data warehousing and has been published in many trade magazines on database technology, data warehousing and enterprise data strategies. He can be contacted at email@example.com.