As I have already explained in my CMG paper (see IT-Control Chart), the data behind the IT-Control Chart (or MASF control chart) actually has three dimensions: two time dimensions and one measurement, the metric, as seen in the picture at the left. The control chart is just a projection onto a 2D cut, with the actual (current or last) data overlaid. So, naturally, the OLAP Cube data model (Data Cubes) is suitable for grouping and summarizing time-stamped data into a cross table for further analysis, including building a control chart. In past SEDS implementations I did not use the Cube approach and had to transform time-stamped data for control charting using basic SAS steps and procs. I have now found that using Data Cubes is somewhat simpler and in some cases requires no programming at all if modern BI tools (such as BIRT) are used.
Below are some screenshots with comments that illustrate the process of building the IT-Control Chart using a BIRT Cube.
The data source (input data) is a table with a date/hour-stamped single metric and at least 4 months of history (in this case, the CPU utilization of a Unix box). It could be in any database format; in this particular example it is the following CSV file:
The result (in the form of a BIRT report designer preview) is shown in the following picture (where UCL is the Upper Control Limit; the LCL is not included for simplicity):
Before building the Cube, the following three data sets were built using the BIRT “Data Explorer”:
(1) The reference set, or base-line (just “Data Set” in the picture), is based on the raw input data with some filtering and computed columns (weekday and weekhour), and (2) the actual data set is the same but with a different filter: (raw[“date”] Greater “2011-04-02”).
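The two computed columns are easy to reproduce outside of BIRT. Below is a minimal Python sketch; the sample rows are hypothetical, and the weekhour definition used here (weekday * 24 + hour, numbering the 168 hours of the week) is my assumption based on how the chart is described:

```python
import datetime

# Hypothetical date/hour-stamped rows standing in for the real CSV input.
rows = [
    {"date": "2011-04-04", "hour": 9,  "cpu_used": 35.2},
    {"date": "2011-04-04", "hour": 10, "cpu_used": 41.7},
    {"date": "2011-04-05", "hour": 9,  "cpu_used": 33.9},
]

def add_week_columns(row):
    """Add the 'weekday' and 'weekhour' computed columns to one row."""
    d = datetime.date.fromisoformat(row["date"])
    row["weekday"] = d.weekday()                    # Monday = 0 ... Sunday = 6
    row["weekhour"] = row["weekday"] * 24 + row["hour"]
    return row

rows = [add_week_columns(r) for r in rows]
```

With this numbering, every Monday 09:00 from the whole history falls into the same weekhour bucket (9), which is exactly what the cross table groups on.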
(3) To combine both data sets for comparing the base-line vs. the actual data, “Data Set1” is built as a “Joint Data Set” in the BIRT Query builder:
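In plain code, the “Joint Data Set” amounts to an inner join of the two sets on weekhour. A sketch with hypothetical records:

```python
# Two data sets as lists of records: the base-line history and the actual
# (last) values, both already carrying the weekhour computed column.
baseline_rows = [
    {"weekhour": 9,  "cpu_used": 35.2},
    {"weekhour": 9,  "cpu_used": 33.9},
    {"weekhour": 10, "cpu_used": 41.7},
]
actual_rows = [
    {"weekhour": 9,  "cpu_used": 52.0},
    {"weekhour": 10, "cpu_used": 39.5},
]

# Inner join on weekhour, mirroring the "Joint Data Set" built in BIRT.
joint = [
    {"weekhour": a["weekhour"], "baseline": b["cpu_used"], "actual": a["cpu_used"]}
    for a in actual_rows
    for b in baseline_rows
    if a["weekhour"] == b["weekhour"]
]
```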
Then the Data Cube was built in the BIRT Data Cube Builder with the structure shown on the following screen:
Note that only one dimension is used here – weekhour – as that is all the cross-table report below needs.
The next step is building the report, starting with the Cross Table (picked as an object from the BIRT Report designer “Palette”):
The picture above also shows which fields are chosen from the Cube for the Cross table.
The final step is dropping the “Chart” object from the “Palette” and adding the UCL calculation using the Expression Builder for an additional Value (Y) Series:
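The cross-table aggregation plus the UCL expression can be sketched as follows. This assumes the classic MASF-style limit UCL = mean + 3 * standard deviation per weekhour; the 3-sigma multiplier is my assumption, since the exact expression is only visible in the screenshot:

```python
import statistics
from collections import defaultdict

# Base-line history grouped by weekhour, as the cross table groups it.
history = defaultdict(list)
for wh, cpu in [(9, 35.2), (9, 33.9), (9, 36.1),
                (10, 41.7), (10, 40.2), (10, 39.9)]:
    history[wh].append(cpu)

# Per-weekhour mean and upper control limit (mean + 3 * sample std dev,
# the STDDEV-style aggregation).
control = {}
for wh, values in history.items():
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    control[wh] = {"mean": mean, "ucl": mean + 3 * sd}
```

The chart then plots the actual series against the mean and UCL series over the 168 weekhours.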
To see the result, one just needs to run the report or use the “Preview” tab in the report designer window:
FINAL COMMENTS
- The BIRT report package can be exported and run under any portal (e.g. IBM TCR).
- It makes sense to specify and use additional Cube dimensions, such as server name and/or metric name.
- The report can be designed in BIRT with parameters. For example, it is a good idea to use the server name as a report parameter.
- To follow the “SEDS” idea and base the reporting process on exceptions, a preliminary exception detection step is needed; it can again be done within a BIRT report, using an SQL script similar to the one published in one of the previous posts:
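As a stand-in for that SQL script (not reproduced here), the check itself is simple: a weekhour is flagged only when its actual value breaks the upper control limit. An illustrative Python version, with hypothetical limit values:

```python
# Per-weekhour upper control limits computed from the base-line (hypothetical
# values here), and the actual values from the last observation window.
ucl = {9: 41.0, 10: 44.5}
actual = {9: 52.0, 10: 39.5}

# Exception detection: only weekhours that break their limit get reported.
exceptions = [wh for wh, v in actual.items() if v > ucl.get(wh, float("inf"))]
```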
I would like to repeat the same exercise (Cube usage for control charting) against the same data stored in a MySQL table. Also, as opposed to the non-programming approach, I am interested in developing an SQL script to do the same data transformation and then charting with BIRT or R.
Lastly, my plan is to do it all in R, meaning to develop an R-based open-source type of application (SEDS-lite) to:
- Connect to a database (MySQL as the test example)
- Filter out exceptions
- For each exception, transform the data for control charting
- Build control charts
- Put the list of exceptions and control charts on a web report.
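The five steps above could be sketched roughly as follows. The plan targets R; this Python outline only illustrates the shape of the pipeline, with the database step stubbed out and all names being placeholders:

```python
import statistics

def load_metric_history(server):
    """Step 1: connect to the database (MySQL in the test example).
    Stubbed out here with hypothetical per-weekhour data."""
    baseline = {9: [35.2, 33.9], 10: [41.7, 40.2]}   # weekhour -> history
    actual = {9: 52.0, 10: 39.5}                     # weekhour -> last value
    return baseline, actual

def detect_exceptions(baseline, actual):
    """Step 2: filter out exceptions (actual above mean + 3 * stdev)."""
    out = []
    for wh, vals in sorted(baseline.items()):
        ucl = statistics.mean(vals) + 3 * statistics.stdev(vals)
        if actual.get(wh, 0.0) > ucl:
            out.append(wh)
    return out

def report(server):
    """Steps 3-5: transform data, build control charts, publish a web
    report (charting/publishing omitted in this sketch)."""
    baseline, actual = load_metric_history(server)
    return {"server": server, "exceptions": detect_exceptions(baseline, actual)}
```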
Any help, comments or contribution offers are very welcome.
Hi Igor, I've been trying to replicate this and am having some trouble. Do you actually compute the standard deviation while constructing the Data Cube, or are you using just the mean to calculate the UCL?
No, I use the STDDEV aggregation function against the expression measure["Data Set::% CPU Used1"] for a calculated field in the cross table within the report itself. I also calculate the UCL as part of chart building. Nothing like that is needed at the cube building level. Maybe it is possible...
BTW, I plan to convert this (and other) posts to video presentations to make them more understandable (I have already published one on YouTube: http://itrubin.blogspot.com/2011/09/power-of-control-charts-and-it-chart.html).
Greetings Noble Igor - I notice in your SQL code you are comparing the last 7 days to the last 180 days (where the larger set is your basis for a reference set). I think you should not include the last 7 days in the 180-day set because: (1) there could be outliers in the recent data that will affect the means and standard deviations, and (2) by including it you are comparing it to itself as a subset of the larger data (like autocorrelation).
Change the baseline select to something like this:
WHERE DATE > (CURRENT DATE - 7 DAYS) - 180 DAYS
If you have identified and kept outliers in a separate table, then:
WHERE DATE > (CURRENT DATE - 7 DAYS) - 180 DAYS AND
DATE NOT IN
(SELECT DATE FROM OUTLIER_TABLE
 WHERE DATE > CURRENT DATE - 180 DAYS)
Tim - Both of your comments are valid, and in real implementations I do exactly what you pointed out, plus some other things... In this and other posts I have simplified some steps just so as not to scare off potential readers/users. Thank you for your notes. They are a good complement to my posts.