Edit: I am retiring support for this tool for now, as you can find an alternative here: Measure Killer | Brunner BI. Please feel free to use this tool as it is, but I am not fixing any bugs or providing further help for it. You can modify it as needed.

Power BI Cleaner is now published with 'Power BI Sidetools' by Didier Terrien; Didier may provide help in some cases.

Edit 27th June 2021: There is a completely new version of this tool that has all the bugfixes and enhancements here: Power BI Cleaner – The BIccountant. Please post issues here:

The VertiPaq-Analyzer tool is one of the great community tools that I really cannot live without. It gives me a great overview of all elements in my model and identifies potential performance problems by showing the storage requirements of each column. So when seeing expensive columns, the first question that arises is: "Do I really need this column, or could I delete it?" Luckily, this can now be answered with my new Power BI Cleaner tool.

Elements that are indirectly used at a later stage

This tool shows the usage of all columns (and measures) within the tables of the VertiPaq Analyzer. Please note that the drill-through pages will only return values for columns and measures that have been used, or referenced by used items (directly or indirectly). So if a column or measure has been used in another column or measure, this will not be shown as long as those elements haven't been used anywhere in the report. This is intentional and reflects the purpose of the tool as a cleanup helper rather than a documentation tool.

The areas covered for usage detection are the following: filters (on visual, page, report and drillthrough level). Columns that are only referenced by their name (without the preceding table name) will NOT be detected. Please let me know if there are areas that I've missed and I'll see if I can include them as well.
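The point about unqualified references can be illustrated with a small sketch. This is not the Power BI Cleaner's actual code; it merely assumes, for illustration, that usage detection scans report metadata for fully qualified Table[Column] patterns, which is exactly why name-only references slip through:

```python
import re

# Hypothetical snippet of report metadata (invented for illustration).
layout_text = (
    '{"filter": "Sales[Amount] > 0", '
    '"measure": "[Total Sales]", '
    '"column": "Sales[Region]"}'
)

# Detect only fully qualified references of the form Table[Column].
QUALIFIED_REF = re.compile(r"(\w+)\[([^\[\]]+)\]")

def find_used_fields(text):
    """Return (table, column) pairs referenced with a preceding table name."""
    return set(QUALIFIED_REF.findall(text))

# "[Total Sales]" carries no table name, so a scan like this misses it --
# the same blind spot described above.
print(sorted(find_used_fields(layout_text)))
# [('Sales', 'Amount'), ('Sales', 'Region')]
```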
Hi, can anyone recommend a software to clean up vendor names which come through the various data sources that I use? Or a method to do this in Power BI? I currently clean up the names into one common name manually in Excel and then load it to Power BI.

I wouldn't procure any software; it's a common problem that can be solved with central "master data". If you build something that looks like this (and it can be Excel) and maintain it, you can use it as a lookup to return the proper name. Anything that doesn't get matched comes back as "Unknown Vendor", which tells you that you need to update your master data. This can be run as a separate data-quality report and can send you an alert too, so you have a chance to fix it before the masses see the result. There are other variations on this methodology.

Having used fuzzy matching in both Excel and Power Query, I'd say that the PQ version is better.

All good solutions / advice from Stuart and Perren. Yes, I would agree with Matthew: fuzzy matching is a great place to start. Also, I forgot to say that, ideally, we should aim to clean the data in each source system over time as well, and try to implement some data-validity checks (if possible) in the source systems to stop the problem recurring.

I had a similar issue with a CRM database which had Company (Customer), Location, and some Job Details all mashed together in one field. The sales people had free-form ability to type into this field, so it was a real mess. I needed to strip out the Company Name so the values were consistent strings. To solve it, I created a custom function that would look in an Excel file list of "Key Words" and return the standardized Client names. You just need a table in a workbook named "Co_KeyWords" with the following columns: "KW ID", "Key Word", "Company Name". You can then add a custom column in Power Query and pass the "bad name" as the function parameter; it will return the "good name" (the Company Name in this case) if there's a match, otherwise it keeps the old company name and adds a "*" to the front of the name. That way you know there was no match in your key word table. As Ian mentioned, I also had a "quality" query that would check for "bad names" with blank Client names. That made it easy to identify the key word and then create the new Clients, as each week's report would only add a few names, if any at all.

I have a similar problem, but with item descriptions. I have a common item list in Business Central and need to map it against supplier lists. I've used the supplier lists to produce a lookup list as Ian has described. However, I used fuzzy matching to do the initial mapping. There's still a certain amount of manual checking, as the match is, well, fuzzy, but the bulk of the work is automatic and just requires verification. Not sure if that's helpful, but it saved me tons of time in trying to map thousands of product records.
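The custom function described in the thread was written in Power Query M, and the original code did not survive. As an illustration only, here is the same keyword-lookup logic sketched in Python, with the "Co_KeyWords" rows invented for the example:

```python
# Invented sample of the "Co_KeyWords" table from the post
# (columns: "KW ID", "Key Word", "Company Name").
co_keywords = [
    {"KW ID": 1, "Key Word": "acme", "Company Name": "ACME Corporation"},
    {"KW ID": 2, "Key Word": "globex", "Company Name": "Globex Inc."},
]

def clean_company_name(bad_name):
    """Return the standardized company name when a key word matches;
    otherwise keep the old name and prefix it with '*' so you can spot
    rows that had no match in the key word table."""
    lowered = bad_name.lower()
    for row in co_keywords:
        if row["Key Word"] in lowered:
            return row["Company Name"]
    return "*" + bad_name

print(clean_company_name("Acme Ltd - Chicago - roof repair"))  # ACME Corporation
print(clean_company_name("Totally New Vendor"))                # *Totally New Vendor
```

In Power Query you would get the same effect by invoking the function in an added custom column, exactly as the post describes.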
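The fuzzy-matching step mentioned for the Business Central item mapping can be sketched as follows. This uses Python's difflib as a stand-in for Power Query's fuzzy matching, with invented item names; unmatched rows are flagged for the manual verification pass the thread mentions:

```python
from difflib import get_close_matches

# Invented master list and messy supplier descriptions.
master_items = ["Copper Pipe 15mm", "Copper Pipe 22mm", "PVC Elbow 90deg"]
supplier_items = ["copper pipe 15 mm", "pvc elbow 90deg.", "mystery part"]

# Case-insensitive lookup back to the canonical spelling.
by_lower = {m.lower(): m for m in master_items}

mapping = {}
for item in supplier_items:
    matches = get_close_matches(item.lower(), list(by_lower), n=1, cutoff=0.6)
    # The best candidate still needs human verification; no candidate at all
    # means the item must be mapped by hand.
    mapping[item] = by_lower[matches[0]] if matches else "UNMATCHED"

print(mapping)
```

The cutoff is a judgment call: lower values match more rows but produce more false positives to verify.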