Digital Transformation for the Average Contractor (Part 4)

This is my 4th and last article on Digital Transformation for the average contractor. The whole point of this series was to help companies understand that they don’t need to know what the future holds to prepare for it. If you missed them, the other 3 articles can be found here…

In this article, we’ll look at an action plan you can use to focus your efforts.

A 5 Step Action Plan

With any effort, it helps to prioritize. There’s a lot to do, and it’s easy to get lost in all the work. Your action plan may look different than this one, and it should if your needs are different. However, these 5 steps are a good starting point.

Step 1 – Mitigate Current Risks

You might have existing risks because of prior actions. You can only mitigate these risks if you identify them. Review what you’re currently doing, because that’s likely where they are. However, some risks may be due to what you’re not doing. Here are some ideas to get you thinking….

  • Is your data backed up? Not just server data but cloud systems, machine tools, etc.
  • Are you managing user accounts in all your technology systems?
  • Do your advanced or complex configurations have documentation?
  • Is your data accurate (BIM Content, Models, Standards, etc.)?
  • Are users trained in proper processes and technology usage?
  • Do all your technologies have an “owner” or responsible party?
  • Is staff cross-trained, or do your technology and processes rely on only one person?
  • Who is maintaining your standards? Is there even governance around them?
  • Are there things critical to your organization controlled by others outside your organization?
  • What things have “Single points of failure”?

Step 2 – Reduce Waste and Inefficiencies (Create Value)

Your next step is something you likely do already: reducing waste and inefficiency. However, it’s a good idea to revisit it occasionally. After you’ve documented a workflow, developed a new workflow, or changed your technology, it’s good to revisit how those changes impact your efficiency and drive value. Here are some general thoughts that can apply to most companies…

  • Are users aware of how your technology should be used (training)?
  • Do you have underutilized or misused technology or processes?
  • Duplicate technology for the same purpose?
  • Are there things you do that are easily outsourced?
  • What can be automated but isn’t?
  • Do new cloud workflows represent what should happen or did you simply move your existing processes into the cloud?
  • Are your computers and hardware set up consistently, maintained proactively, or built with automated processes?

Step 3 – Missed Opportunities

One area people don’t think about enough is missed opportunities. You’re always watching costs vs. benefits, the results of doing things. But what’s the cost of not doing something? What’s the cost of a missed opportunity? These could come in many forms. It’s best to build your own list, but here are a few examples…

  • Leverage knowledge from existing staff
  • Free or joint marketing from vendors or customers
  • Missed value you could sell if you were leading edge with technology
  • R&D opportunities with technology vendors
  • Existing competencies not marketed properly

Step 4 – Prepare for the Future

This step is really what Parts 2 and 3 of my series were about. These are things you can do now, despite an unknown future. There’s a lot you can do right now that sets you up for success down the road. However, you don’t need to wait until the first three steps are done. You can start chipping away at these now. They just shouldn’t be your primary focus until Steps 1 through 3 are well underway. Here are a few ideas; feel free to add your own…

  • Document existing processes
  • Develop ideal workflows
  • Start building missing competencies in staff and departments
  • Restructure existing technology stacks
  • Capture wisdom of staff nearing retirement
  • Reverse-mentor older staff using your tech-savvy younger generation

Step 5 – New Strategic Opportunities

This step is the hardest to provide guidance on. It really depends on where your company sees itself going and how the skills it has can differentiate it in the marketplace.

Here’s where an ear to the ground can be helpful: trying to anticipate which trends in construction may be fads versus long-lasting or even transformational. How can you leverage them? How can you change your business to remain relevant? Regardless of the future, if you’ve followed all the other guidance, you should be able to adapt easily when these trends emerge.

  • Will you be a manufacturer in an “Industrialized Construction” economy?
  • Do you have staff capable of developing prefabrication or modularization strategies?
  • Are you able to be an efficient supply chain provider?
  • Can you help your customers with smart building technology?
  • Will Machine Learning or Artificial Intelligence render you obsolete or is it simply a tool you use?
  • Who are the new players in the construction economy and is there value you can bring to them?
  • How can you capitalize on the struggles of your competition?

Summary

Aside from Step 5, everything in this series of posts is something you can start doing right now. These are things that don’t require a prognosticator’s view of the future. Yet they’re all things that will help you be more agile and better able to adapt when trends or disruption come knocking.

You don’t have to worry about the future to prepare for it. There’s enough to do right now that you can stop worrying. More than likely, at some point you’ll take notice and see you’re living the future. The actions and choices you made to prepare will have allowed you to tackle the future without even realizing it.

Make smart choices. Stay busy. Stay Relevant. You can eat the future one small bite at a time.

Digital Transformation for the Average Contractor (Part 3)

In Part 1, I discussed the problem and the overall objective of digital transformation in construction. In Part 2, I outlined four activities you could take on right now, activities that cost nothing more than time. These activities can really help inform you, guiding where you should start when aligning your technology stack.

There are a lot of things you can do to better align your technology. While there’s no magic formula, there are a few categories these activities fall under.

1 – Technology Removal


In many cases, you might be removing technology from your portfolio. Maybe it’s obsolete or ineffective. Whether your processes have changed or the technology didn’t mature, it’s best to remove things that are no longer needed or don’t provide the value you were looking for.

It could also be that there’s overlap in solutions. Does more than one product serve the same purpose? You typically don’t want more than one solution for the same problem. There can always be exceptions. But you should have a very good reason if you have duplicate technology.

2 – Technology Realignment


In many cases, you already have good tech in place. However, you may not be using it correctly or to its potential. This is often the result of someone solving a specific problem and buying a product to address it. That can leave you with technology that hasn’t been fully implemented or has been implemented poorly.

This doesn’t mean the solution is bad or that the effort was bad. But it can be helpful to revisit. Is there more value you can leverage? Are more of your processes and workflows covered? Can you use the product differently to achieve a greater purpose?

When you realign existing tech, it often just takes time. Time to look again at the factors that led to its use and at how things have changed, both changes in your process and ways the product may have matured since it was first selected.

Can you change your process to better accommodate the product’s value proposition? Can you change how you use the product to better serve your needs? Here’s another case where it only takes time, provided you have a good tech-savvy person in house.

It might also be wise to leverage a vendor or consultant to help. You might also consider leveraging vendors for training. They have experience with other firms using their tools, and with that knowledge they can often point out use cases you might not have thought of.

The other cost during technology realignment is licensing. If you’ve under- or misutilized technology, fixing the problem may result in additional use. That translates to additional licensing costs for you. This isn’t bad. It’s good. The whole point is to gain value and productivity, and both of those things should be worth paying for. If not, it’s a sign you’re using the wrong tool.

3 – New Technology


When it comes to new technology, the number of choices can be overwhelming. You may not know what the future holds. And every vendor claims they’ll lead you there.

The fact is, you don’t need to be a prognosticator to choose good technology. There’s a number of basic concepts and criteria you can use when evaluating technology. Concepts that help you make better choices regardless of what the future holds.

When I look at technology, there are a number of questions I ask myself about a potential solution. Here’s a partial list of things to take into consideration. There is no right or wrong answer, and they won’t all be true. But based on these and other factors, you can get an idea of whether you have a good solution. When you use these criteria, your choice will likely be better suited to the future even if that future is unknown. Make your own list or add to this one….

  • Is it cloud based or enabled?
    Most things are migrating to the cloud. If it’s an on-premises-only solution, it’s not aligned with the future as well as a cloud-based solution.
  • Does it reduce or eliminate paper?
    Anything that reduces or eliminates paper will help reduce static obsolete data.
  • Will it reduce or eliminate data files?
    Much like paper, data files are typically copies of the real data in a system. Think of a PDF: it’s really electronic paper. Give preference to anything that gives you real-time access to data without needing “files”.
  • Is duplication of data reduced or eliminated?
    Data duplication is never good. It requires extra effort to keep in sync or find out why it’s different. Find solutions that reduce data duplication in your environment.
  • Will it simplify or eliminate processes?
    If it makes things simpler, there’s less waste, less training, and fewer things to go wrong.
  • Does it simplify IT infrastructure?
    Your IT infrastructure can often be impacted by technology. Solutions that simplify your infrastructure can often be an added benefit.
  • Is it Model based?
    Can the solution leverage your BIM or CAD models? Not just export data from them or convert them, but use them directly to provide value? A model-based enterprise is what you should be striving for.
  • Can you integrate it with other solutions?
    If you can’t integrate with anything else, you’re boxed in. Even if you don’t have anything currently to integrate with, you typically want that ability later should you need it.
  • Does it have an API (Application Programming Interface)?
    Without an API, you can’t automate anything like data mining or integrating with other solutions. Even if you don’t have a programmer, you may want to use one later. Don’t limit your future options without a good reason.
  • If it doesn’t meet some of your criteria, does it get you closer?
    Sometimes we just can’t get everything we want. It might be too much of a change, or maybe it’s just not available. But can you get closer? Don’t overlook incremental improvements.
  • Digital twin?
    Does the solution get you closer to having a digital replica of your product, facility and/or process?
  • Has manufacturing used a similar solution?
    Manufacturing has led the trends seen in construction by two or three decades. BIM in AEC is similar to PLM (Product Lifecycle Management) in manufacturing. 3D parametric modeling? LEAN? Model Based Enterprise? Prefabrication and modular construction use concepts similar to DfM, DfA, and DfMA in manufacturing, concepts that helped re-shore a lot of work that was once offshored.
  • Have other industries used similar solutions or concepts?
    Shipbuilding has transformed to use modular. Healthcare leveraged a lot of Toyota’s Lean concepts. GIS uses CAD data linked to external data sets. A lot can be learned from watching others.
  • Is time eliminated from your process?
    Any solution should put time back in someone’s day or reduce lead times.
  • Does it build in quality?
    If using the solution, will it help mistake proof your processes?
  • Will the solution require a dedicated administrator?
    Many solutions sound good on the surface but require a lot of administrative overhead. Verify the cost of administration when looking at any technology.
  • Is data better organized?
    Will using the solution help better organize your data and information? Data is of no value if others can’t find what they need.
  • Can it leverage or use existing data?
    You have a lot of data already. Can it leverage or use what you already have and provide more value to an existing data asset?
  • Does it turn data into information?
    Data is worthless. Information is priceless. Make sure any solution provides information, not just data.
  • What % of existing data/systems is being used?
    How much of what the solution offers is actually going to be used or helpful? Features don’t provide value if they’re never used.
  • Is the data in system(s) or file(s)?
    Data that resides in a “system” or database is typically more flexible than in a “file”. File based data typically requires additional management and processes. This makes them prone to user errors.
  • Can data be captured as a natural byproduct of using the product or does it require a separate work activity?
    If it takes you a lot of time to log, capture or report on data, that’s a separate work activity. Any system that makes you feel like you need a separate cost code to account for your time just to use it is likely not a good solution.
  • Does using the system help standardize data?
    Standardized data typically yields more value with higher reliability and often eliminates a lot of human error.
  • Will it help with “Aggregation of marginal gains”?
    Sometimes a solution’s value isn’t the “one big thing” it does, but rather that it does, or helps facilitate, a lot of small incremental improvements.
  • Can you get a return in 1-2 years?
    Don’t worry about predicting the future. Tech moves fast, and a solution that pays for itself within a year or two doesn’t need a long-range forecast to justify it. Be cautious of any solution claiming it’s the “future”.
  • Is it the lesser of 2 evils?
    Never a decision anyone likes to make, but sometimes a problem is big enough, and the benefits in other areas are significant enough, that what you compromise on is the lesser of two evils.
  • Does it get you closer to your vision?
    Sometimes you can’t implement the solution you want. It may not exist or it’s just too big a change for your organization to swallow. Don’t dismiss smaller changes over time. Nobody said it’s a permanent solution.
  • Are Licensing terms flexible?
    Paying the same licensing cost for part-time users as for full-time users is wasteful. Likewise, solutions that want a percentage of revenue can be costly. You can’t always choose licensing terms, but they’re often negotiable to a point. Verify that the percentage of revenue is revenue from processes the solution addresses. Try to limit licensing costs early on in the implementation…you’re not using the full solution on day one.
  • Can you replace the solution easily in the future?
    Nothing is forever. How easily will you be able to swap the solution for another should your needs change?

Your list of things to consider when choosing technology solutions can and should vary. The point is, we don’t have to predict the future with laser accuracy to select good technology. Use common-sense concepts and principles and you’ll be well positioned for an unknown future.

In Part 4 (my last in the series), I’ll cover some aspects and approaches to prioritization.

Autodesk Fabrication: Best Practice #13

Use “Match by Name only” in Database Settings

Do you have issues with duplicate entries in your Fabrication database? These could be proxy entries…those followed by text enclosed within {brackets}. Or they could be identical duplicates, if someone made the proxy item permanent.

This can be caused by using the Strict matching setting in your database settings. It’s recommended to use Match by Name only instead.

When you use Strict matching and open drawings or MAJ files, the database settings within those files are compared to those in your configuration. If the data is deemed relevant and it varies, even something as small as a number carried to 3 decimals vs. 4 decimals can add another entry to your configuration.

When using Match by Name only, as long as the name (and group) matches, the entry is considered the same and you don’t end up with duplicate entries.

Autodesk Fabrication: Best Practice #12

Compress Fabrication Data Files.

Autodesk Fabrication configurations can compress their data files, and it’s a good idea to have this enabled. Not only does this make the files smaller and take up less space, it makes them faster to load. Performance improves because expanding the data in memory is faster than reading more data from disk.

You can enable this option in your database settings. Doing this does not automatically compress existing data that’s not already compressed. The following image shows a suggested sequence of operations. This would both enable compression and compress the existing data.

  1. First, enable compression by selecting the Compress File to Save Disk Space toggle. Future writes to data tables will be compressed if they are configured to be.
  2. Next, enable the Compress Database Files (.MAP) and Compress Item Files (.ITM) options. This tells Fabrication to compress the existing database and item files. Also, unselect the Compress Jobs (.ESJ .MAJ) option.
  3. Click the Compress Now button. This compresses the Database and ITM files but will not scan your ESTmep and CAMduct job files.
  4. Once compressed, select the Compress Jobs (.ESJ .MAJ) option. This will compress all future ESJ and MAJ files but not existing ones. If you wanted, you could have left that option selected in Step 2. However, it would significantly increase the time it takes to perform the compression process. Because most of your ESJ and MAJ files are likely past jobs, there’s really no value in processing them now….but you could.
  5. Press the OK Button to save these settings.

Check Settings for Each Product, Version and User on Each Computer

You should also know that these settings are NOT saved in your configuration. The file that stores these settings is located here…

C:\Users\<user>\AppData\Local\Autodesk\Fabrication <version>\<product>\UserOpt.MAP

<user> = User's Windows Login Name
<version> = Autodesk product version. (e.g. 2018, 2019, 2020, etc.)
<product> = Autodesk Fabrication product (e.g. CADmep, ESTmep, CAMduct)

You can tell by the folders that this setting is stored separately for each user on a computer. Because each product and each version is part of the path, those variations need to be set too.

Because Best Practice #9 tells you to use only one version for database administration, version may seem unimportant. But it IS important when you upgrade to a newer version for administration: those versions should also have these settings reviewed.

Every user who does work in your database should check each product and version for these settings. If they don’t, your work may compress files while their work decompresses them.

Because clicking Compress Now just once works its magic on your whole database, you don’t need to click it again for each version, user, product, or computer. On those, the options merely need to be set, telling each product whether data should be compressed or decompressed.
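If you want to audit where these per-user settings live on a machine, a short script can list every UserOpt.MAP it finds. This is just a sketch: it assumes the default profile layout shown above, and it will only see other users’ folders if run with sufficient permissions.

```python
import glob
import os

# Pattern mirrors the UserOpt.MAP location shown above:
# C:\Users\<user>\AppData\Local\Autodesk\Fabrication <version>\<product>\UserOpt.MAP
pattern = r"C:\Users\*\AppData\Local\Autodesk\Fabrication *\*\UserOpt.MAP"

for path in sorted(glob.glob(pattern)):
    parts = path.split(os.sep)
    user = parts[2]                  # <user>
    version = parts[6].split()[-1]   # "Fabrication 2020" -> "2020"
    product = parts[7]               # CADmep, ESTmep or CAMduct
    print(f"{user:<15} {version:<6} {product:<8} {path}")
```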

Autodesk Fabrication: Best Practice #11

Don’t Use Commas (,) in Database Entry Names or ITM File Names. Don’t Use Them Anywhere.

Similar to Best Practice #1 (Don’t use Double Quotes), you should avoid using commas. Commas are the delimiting character in a CSV file. Using a comma can throw off the data columns in data exports that use the CSV file format.

Below, you can see Autodesk let a comma slip into a file name in their Metric Configuration.

Yes – Ancillary in Ancillary Kit
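If you want to check your own content for the same problem, a quick scan of the configuration folder works. This is only a sketch; the folder path below is a made-up example, so point it at your actual database or ITM content location.

```python
import os

# Example path only; point this at your own Fabrication configuration/content folder.
database_root = r"C:\FabricationData\MyConfig"

offenders = []
for folder, _, files in os.walk(database_root):
    for name in files:
        # Flag any ITM file whose name contains a comma.
        if name.lower().endswith(".itm") and "," in name:
            offenders.append(os.path.join(folder, name))

print(f"Found {len(offenders)} ITM file name(s) containing a comma:")
for path in offenders:
    print("  " + path)
```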

Intolerance of Tolerances

In a recent LinkedIn post, the topic of Tolerance Stacking was brought up. I’m not a machine designer, but I’ve spent a lot of my past life in Manufacturing. In that world, the term was used frequently. If the term was used in Construction, it certainly wasn’t when I was listening.

Tolerance Stacking can be described (in my mind) as the accumulation of allowable tolerances to a point where the design is no longer suitable for its intended purpose. Errors resulting from Tolerance Stacking are caused by a few things…

  • Lack of tolerance awareness
  • Poor annotation and documentation of tolerances
  • Both of the above

Tolerance Stacking Explained

The best way to understand Tolerance Stacking is from a few examples. In our first example, we see a part 10 Units long with 9 holes, equally spaced 1 Unit apart. Take note of the RED dimension on the right.

10 Unit Long Part with 9 Holes Spaced 1 Unit apart

You may have seen parts dimensioned like this. Looks pretty normal. Now let’s consider this same part and assume the dimensions have a tolerance of +/- 0.0625 (1/16 inch). Now let’s also assume that all the dimensions come in at the negative extreme, -0.0625. The following graphic illustrates this condition. Again, notice the RED dimension on the right.

Tolerance Stacking using the allowed -0.0625 on each dimension.

Does the overall length really have enough tolerance to compensate for the accumulation of those individual tolerances?
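To put rough numbers on it, assume the 10-unit length is covered by ten chained 1-unit dimensions (the exact count depends on how the part is dimensioned) and every one of them comes in at the -0.0625 extreme:

```python
nominal_step = 1.0    # nominal distance covered by each chained dimension
dim_count = 10        # assumed number of chained dimensions across the part
tolerance = 0.0625    # each dimension allowed +/- 0.0625

nominal_overall = nominal_step * dim_count                   # 10.0
worst_case_overall = (nominal_step - tolerance) * dim_count  # 9.375
stacked_error = nominal_overall - worst_case_overall         # 0.625

print(f"Nominal overall length : {nominal_overall}")
print(f"Worst-case overall     : {worst_case_overall}")
print(f"Accumulated error      : {stacked_error} (vs. +/-{tolerance} on any one dimension)")
```

Under that assumption, the error at the far end is ten times what any single dimension allows.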

Now let’s look at the same part with the same tolerances, but annotated/documented differently. It’s not as “pretty” and takes up a lot more real estate on your drawing.

Same part as before but dimensioned differently.

But let’s look at that same -0.0625 extreme-case tolerance in this scenario. Once again, keep an eye on that RED dimension to the right.

Using an alternate annotation approach solves the Tolerance Stacking problem.

This latest example solves the Tolerance Stacking issue by clearly outlining where the tolerances are allowed. In fact, in construction, we’re already doing this. We just don’t call it Tolerance Stacking.

In construction, one of the ways we eliminate Tolerance Stacking is by dimensioning to gridlines and columns. Dimensioning in relation to known fixed points minimizes Tolerance Stacking.

Are you old enough to remember when rafters were laid out by hand on the job site and cut individually? You would cut one, keep it as a template, and use it to mark all the others. You never installed the template and used the next cut as the template for the one after that. This minimized Tolerance Stacking as well.

Geometric Dimensioning and Tolerancing – GDT

What’s less familiar is another concept used heavily in automotive and other precision manufacturing. It’s called Geometric Dimensioning & Tolerancing, or “GDT” for short.

Traditional linear tolerances have flaws. GDT, on the other hand, more accurately describes “features” and the allowable deviation from the desired location using a richer set of graphics and symbols.

Once again, the best way to explain this is with some illustrations. The following example shows a square part with a hole in the middle. Pay close attention to the RED dimensions.

Top Left – Perfect Part (not real world)
Top Right – Hole moved 0.0625 to the right
Btm Left – Hole moved 0.0625 up
Btm Right – Hole moved 0.0625 in both directions

In this example, you see that when the hole is moved to the maximum tolerance in both directions, it’s actually farther away from its desired position than 0.0625.
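The arithmetic behind that statement is just the diagonal formed by the two allowed shifts:

```python
import math

shift = 0.0625  # allowed shift in each direction (X and Y)

# Moving the hole by the full tolerance in both X and Y puts its center on the
# diagonal, which is farther from the ideal position than either shift alone.
actual_deviation = math.hypot(shift, shift)

print(f"Shift in one direction     : {shift}")
print(f"Deviation with both shifts : {actual_deviation:.4f}")  # ~0.0884
```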

This is where GDT comes in. In this last example, GDT is used to “describe” the allowable deviation from the hole’s ideal position.

GDT can more accurately describe tolerances.

There’s actually an ASME standard for GDT (Y14.5), and a full explanation of GDT is beyond not only the scope of this blog but also my knowledge. There are a lot of courses out there specifically for this, but a good “101” description can be found here.

Given trends in Prefab, Modularization, and Construction becoming more like Manufacturing….Makes you wonder….should there be a “GDT” style of documentation for construction?

ESTmep Cost Exposed in Revit

If you’re a user of ESTmep and Revit Fabrication parts, consider yourself warned. I’ve recently had some dialog with an industry colleague and the discussion of Cost data in Revit came up.

We know that a Revit file which uses Fabrication Parts contains a copy of your Fabrication configuration (database). We also know that the Fabrication extension for Revit now allows you to run reports, and those reports can include Cost data. That’s generally a good thing; in most firms using ESTmep, exposing that Cost data to Revit users can be very helpful.

Now when you send someone your Revit model, they do NOT have access to your database (unless you send that to them as well). Without your database, the Fabrication Add-In will not find the reports and the option is grayed out.

You can’t change the configuration either, because the drop-down is disabled. They need your database to do anything….maybe.

So this sounds like we’re OK, but let me assure you that’s not the case. Your database isn’t “available” to the person who has your Revit file, but it is contained within the Revit file itself. And even though the Revit APIs don’t give you access to the costing data, it can be extracted.

I won’t go into details for the sake of security in our industry, but rest assured, there is a process whereby a user can extract your cost data. This includes being able to figure out your vendor pricing multipliers.

What To Do?

That leaves the question of what to do. Some may be familiar with the option in Edit Configuration that disables the storing of EST tables in DWG files. This has NO effect on Revit. Sure would be nice if it did, but that’s not the case.

So there are really 2 options that I can see….

  1. Remove or rename the COST.MAP, ETIMES.MAP, FTIMES.MAP and SUPPLIER.MAP tables from your database. These are where labor rates, times and costs are stored. Without these tables, Revit cannot store this information in the model. If you previously had a Revit model with this information saved, rename/remove the files and reload your configuration, and the data will be removed. The downside is you’ll no longer be able to use ESTmep.
  2. Make a copy of your database without the COST.MAP, ETIMES.MAP, FTIMES.MAP and SUPPLIER.MAP tables and have Revit point to that. Each time you update your Fabrication database, you’ll need to refresh this copy. It’s fairly easy to script this process and have those files removed (a rough sketch of one such script follows this list). The downside is you’ll no longer have access to Cost data in Revit, but at least you can keep using ESTmep internally.
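Here’s a minimal sketch of the second option in Python. The source and destination folders are placeholders, and it assumes your configuration is a folder you can simply copy; adjust it to your environment before relying on it.

```python
import shutil
from pathlib import Path

# Placeholder paths; point these at your real Fabrication configuration
# and at the stripped-down copy you'll have Revit use.
SOURCE = Path(r"C:\FabricationData\MasterConfig")
REVIT_COPY = Path(r"C:\FabricationData\RevitConfig")

# Tables that hold labor rates, times and cost data.
COST_TABLES = {"COST.MAP", "ETIMES.MAP", "FTIMES.MAP", "SUPPLIER.MAP"}

def ignore_cost_tables(folder, names):
    """Tell copytree which files to skip in each folder."""
    return [n for n in names if n.upper() in COST_TABLES]

# Refresh the copy each time the master configuration is updated.
if REVIT_COPY.exists():
    shutil.rmtree(REVIT_COPY)
shutil.copytree(SOURCE, REVIT_COPY, ignore=ignore_cost_tables)
print(f"Copied {SOURCE} -> {REVIT_COPY} without the cost tables.")
```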

If you feel this is unacceptable, please submit a support ticket with Autodesk. The more people that raise the issue, the more likely it will be addressed in a future release or update. To date, all they’ve told me is that the options I’ve outlined are the ONLY way to address the issue.

IT Hack Every BIM/CAM Manager Should Know

How many times have you had to reconfigure a system because a server changed? Remapped drive letters? Links that used UNC paths? It’s downright annoying when IT needs to replace a server and its name is different. But it is understandable…systems get old and outdated and need upgrading or replacing.

What I don’t understand is why we use server names at all. Networks route traffic using IP addresses, not server names. But nobody can ever remember all those numbers, so when you type a server name, your computer queries the DNS server to look up that server’s IP address. Makes sense, right?

But we don’t have to use server names…or IP addresses. DNS can be configured with a CNAME Alias (CNAME – Canonical Name). A CNAME Alias is just another human-friendly piece of text that points to another server name or even another CNAME Alias (an A record does the same thing for an IP address).

What can a CNAME Alias do for us?

To understand what a CNAME Alias can do for us, let’s take the example of a license server. All your client software points to the server name…let’s say it’s something like “P-LA-LIC01“. Now your IT rolls out a new server for licensing…now it’s going to be “P-LA-LIC02“. All your clients need to be updated to the new server name. Depending on the sophistication of your IT and of the software, they might be able to push an update. But more often than not, the local CAD/BIM Manager is left updating clients.

With a CNAME Alias, you could create a nice user friendly name like “ADSK-LICENSE“. All your software would use this name instead of the server name.

This CNAME Alias is set up on your DNS server. Most IT groups won’t give you access, but if you know how it works, you can request that they set it up for you. Just tell them you want a CNAME Alias named “ADSK-LICENSE“ that points to “P-LA-LIC01“ (ADSK-LICENSE -> P-LA-LIC01). Now when you’re ready to cut your users over to the new license server, have IT update the DNS record for the alias to point at the new server overnight. The next morning, everybody is pointed to the new server, no reconfiguration required.

If someone has left their computer on, they may need to reboot to see the change. In the unlikely event it still doesn’t work, they can open a command prompt and flush the DNS cache with the command “IPCONFIG /FLUSHDNS“.
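To confirm where an alias currently points, you can resolve it from any workstation. This sketch uses Python’s standard socket module and the hypothetical names from the example above:

```python
import socket

alias = "ADSK-LICENSE"  # the hypothetical CNAME alias from the example above

# gethostbyname_ex returns (canonical name, alias list, IP addresses).
canonical, aliases, addresses = socket.gethostbyname_ex(alias)

print(f"Alias      : {alias}")
print(f"Resolves to: {canonical}")
print(f"IP address : {', '.join(addresses)}")
```

Once the DNS record is updated and caches expire, the same alias resolves to the new server without touching any client configuration.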

Where does this work?

Just about anywhere you would type a server name. You can map a drive letter using an alias. You can use an alias in a UNC path. You can even create a friendly DNS name for the IP address of a network printer or other equipment (technically an A record rather than a CNAME, but the same idea).

The only real downside to using a CNAME Alias is that it doesn’t show up when browsing your network. But all things considered, that’s not a bad thing. If it were up to me, no user anywhere would ever know the names of the servers…only the aliases. I really don’t know why this trick isn’t used more often. IT uses it for their own purposes; it just rarely gets implemented in a way that benefits users. Using this approach, I’ve migrated hundreds of users in multiple locations to new servers with a 3-second DNS update in the evening. I suggest you give it a try.