
CARBON EFFICIENCY

TIME TO THROW AWAY THE DISPOSABLE BUILDING MINDSET

BY NIALL MCSWEENEY

Our buildings have become as disposable as TVs or smartphones. But as we begin to count the true cost of embodied carbon, the way we design, construct and value our buildings will come under scrutiny.


According to the World Green Building Council, buildings are responsible for 39% of global carbon emissions: 28% from operations and the remaining 11% from materials and construction. Until recently, our industry’s focus was fixed firmly on operational carbon. But as the world’s population approaches 10 billion by the middle of the century, and as the global building stock doubles in size, embodied or ‘upfront’ carbon will be responsible for half our carbon footprint. Some estimates suggest embodied carbon can account for up to 75% of a building’s total carbon footprint over its lifetime. As the construction sector starts a new sustainability conversation, investors are beginning to ask: Would a price on upfront carbon impact the value of my asset?

As guardians of the purse strings, Quantity Surveyors know the cost of every screw and nail that goes into a building. So how can we apply this scrutiny to embodied carbon? There’s no real reason why value engineering decisions shouldn’t also take embodied carbon into account. Structural tweaks could reduce both costs and embodied carbon simultaneously, leaving Quantity Surveyors uniquely positioned to determine carbon efficiency.

THE SIGNPOSTS POINT TO GREATER SCRUTINY

Environmental, social and governance (ESG) factors are increasingly influencing operational and financial performance. Investors are aligning their portfolios with the Paris Agreement, and are prioritising ESG and climate change in their investment strategies. The United Nations’ Net Zero Asset Owner Alliance, for example, now represents roughly $7.2 trillion in assets under management. Members of the alliance have pledged to transition their investment portfolios to net zero emissions by 2050. Meanwhile, the Climate Action 100+ group, representing more than 500 global investors with $61 trillion in assets, is calling out the world’s big companies for not moving fast enough on climate change. At the building scale, Bill Gates is championing the use of embodied carbon calculators, and has promised Microsoft’s massive overhaul of its 30-hectare campus in Redmond, Washington, will cut embodied carbon in building materials by at least 30 per cent on business-as-usual. Regulation is slowly on the move in the US and Europe, with decarbonisation targets and Building Code amendments set to limit embodied carbon in construction. Australia will have no choice but to follow suit if we are to meet our obligations under the Paris Agreement.

MEASURING THE IMPACT OF MATERIALS

We have been talking about how to quantify the true carbon footprint of a building for decades. In Australia, the industry was scratching its head as far back as the Sydney Olympic Games in 2000. But we have made little headway because the challenge is so complex. Buildings are constructed with three primary materials: concrete, steel, and timber. Let us take a closer look.

CEMENT

The cement sector is the third-largest industrial energy consumer in the world, responsible for around 7% of global carbon emissions. Cement is the key ingredient of concrete, and consumption is projected to increase by 12-23% by 2050. Cement (and steel, explored below) is produced at very high temperatures, making it energy intensive. The chemical reaction that occurs during manufacture also releases carbon dioxide, making it a material that is hard to abate.

The alternative to concrete is polymer, but the cost of polymers ranges from 10 to 100 times that of cement. Recent trials of self-healing polymer cement in the US could extend the life of concrete-based structures by 30 to 50 years, addressing the cost gap while reducing the volume of cement sent to landfill.

STEEL

Responsible for 7-9% of emissions, global steel production is forecast to grow by 30% over the next 30 years. Recycled secondary steel is expected to grow faster than primary production. The steel industry is making headway to reduce carbon intensity through hydrogen technology and carbon capture. But currently, the best way to reduce the carbon footprint of steel is to use less of it.

TIMBER

Engineered timber options, like cross-laminated timber (CLT), laminated veneer lumber (LVL), and glue-laminated timber (glulam) can be effective carbon reduction solutions, as a tonne of wood locks in around a tonne of carbon over a building’s lifetime.


Timber construction can offer other benefits: a compressed construction schedule, reduced labour costs and enhanced safety from manufacturing offsite among them. Lighter-weight construction also minimises the amount of concrete needed in foundations.

In 2016, Australia’s National Construction Code introduced new ‘deemed-to-satisfy’ provisions for timber buildings up to 25 metres, or eight storeys high, provided they feature a raft of requirements, like fire sprinklers and fire-resistant cladding. But that does not mean timber is always the go-to solution. Tall towers may require 100,000 cubic metres of CLT – and trees do not grow fast enough to meet that sort of demand.

NO LONGER DESIGNED FOR DISPOSAL

We also need to discard the idea of disposable buildings. Despite being made from materials that last millennia, many of our modern buildings have a lifespan of half a century. There are lighthouse examples of leadership. AMP Capital’s decision to retain around two-thirds of the original core of Quay Quarter Tower saved 6.1 million kilograms of carbon – equivalent to around 35,000 Sydney to Melbourne air flights. But for many asset owners, there remains a tipping point at which any building upgrade becomes unfeasible – especially when valuations do not currently take into account demolition or recycling costs. Expect this to change as international standards evolve. The draft London Plan, for example, requires a whole life carbon assessment, which means new development proposals will need to consider both the embodied carbon impacts and operational energy performance.

There are big hurdles ahead, including standard methods of measurement of embodied carbon, incentives for the leaders and regulation for the laggards. But as building tenants increasingly assess their choices against their commitments to the Paris Agreement, we expect the embodied carbon content of a building will impact its value. As we move towards a net-zero carbon world, and as we throw away the disposable building mindset, a new value engineering opportunity is emerging – one that goes beyond the traditional cost, time, and quality. The secrets to solving the embodied carbon conundrum include:

1. Optimise existing structures: develop design strategies that repurpose existing assets through renovation and reuse

2. Choose materials with care: consider embodied carbon emissions from the outset, evaluating each material against its true lifecycle cost and low-carbon alternatives

3. Plan for the future: consider the asset’s end of life, designing for disassembly and deconstruction to support future reuse and recycling

4. Embrace efficient construction: apply construction technologies and techniques that minimise waste on site

5. Collaborate to innovate: no one company or sector can solve this complex challenge alone, so work with industry groups, and establish partnerships with suppliers, clients, and customers

6. Support regulation: back progressive but predictable policies that incentivise leadership and encourage market movement towards net zero emissions.

With the right approach, tackling upfront carbon can become central to the cost management process.

Niall McSweeney is a Senior Director of Altus Group Asia-Pacific

CONSTRUCTION DELAY

WHAT IS CONCURRENT DELAY? AN OVERVIEW

BY ROBERT GEMMELL


There is no single definition of concurrent delay. Various definitions have been put forward and, in this article, I will review a selection of the definitions and descriptions of concurrent delay put forward by Marrin, Keating, the court in City Inn, the court in Adyard, the SCL Protocol 2nd edition, and some views from the United States. There are others but it is beyond the scope of this article to review them all.

MARRIN

Concurrent delay has been defined by Marrin as project delay caused by two or more effective causes of delay which are of approximately equal causative potency.¹ On this definition, there would only be concurrent delay if the effect of the employer delay event and the contractor delay event is felt at the same time.

Only in exceptional circumstances is concurrency of the kind Marrin defines likely to occur and this narrow definition is therefore possibly too limited.²

KEATING

Keating says that it is probably sufficient to say:³

• each delay event, in the absence of any competing event, has caused delay
• each delay event is on the critical path
• the delays caused by the employer and the contractor overlap.

CITY INN

In City Inn v Shepherd Construction, the court emphasised that there are problems in using expressions such as ‘concurrent delay’ or ‘concurrent events’ and said:⁴

“One of the problems in using such expressions as “concurrent delay” or “concurrent events” is that they may refer to a number of different situations. Confining attention for a moment to concurrent delaying events, which may be taken to mean relevant events and other events, or causes of delay, which are not relevant events, there would seem to be several possibilities. Such events may be described as being concurrent if they occur in time in a way in which they have common features.

One might describe events as concurrent on a strict approach only if they were contemporaneous or coextensive, in the sense that they shared a starting point and an end point in time. Alternatively, events might be said to be concurrent only in the sense that for some part of their duration they overlapped in time. Yet again, events might be said to be concurrent if they possessed a common starting point or a common end point. It might also be possible to describe events as concurrent in the broad sense that they possessed a causative influence upon some subsequent event, such as the completion of works, even though they did not overlap in time. In other words, they might also be said to be contributory to or co-operative in bringing about some subsequent event.”

References

¹ John Marrin QC, “Concurrent Delay” (2002) 18 Construction Law Journal 6 at 436, approved in Adyard Abu Dhabi v SD Marine Services [2011] EWHC 848 (Comm); J Marrin, “Concurrent Delay Revisited” (February 2013, SCL Paper No 179); M Cocklin, “International Approaches to the Legal Analysis of Concurrent Delay: Is There a Solution for English Law?” (April 2013, SCL Paper No 182); V Moran, “Causation in Construction Law – The Demise of the Dominant Cause Test?” (November 2014, SCL Paper No 190).
² Stephen Furst and Vivian Ramsey, Keating on Construction Contracts (10th ed, Sweet & Maxwell, 2016), [8-025].
³ Stephen Furst and Vivian Ramsey, Keating on Construction Contracts (10th ed, Sweet & Maxwell, 2016), [8-025].
⁴ City Inn v Shepherd Construction Ltd [2010] CSIH 68, CA101/00 at [49].


To summarise, the court in City Inn said:

• On a strict approach, events are concurrent only if they were contemporaneous or coextensive, in the sense that they shared a starting point and an end point in time.
• Alternatively, events may be concurrent:
  – if part of their duration overlapped in time
  – if they possessed a common starting point or a common end point
  – if they possessed a causative influence upon some subsequent event, such as the completion of works, even though they did not overlap in time.

Although the court in City Inn refers to the “events” being concurrent, it is submitted that the court is referring to the delay to progress rather than the event that caused the delay.

ADYARD

Subsequent to City Inn, in Adyard,⁵ the court considered that:

“…there is only concurrency if both events in fact cause delay to the progress of the works and the delaying effect of the two events is felt at the same time.”

The court in Adyard also said:⁶

“… the act relied on must actually prevent the contractor from carrying out the works within the contract period or, in other words, must cause some actual delay.”

SCL PROTOCOL 2ND EDITION

The SCL Protocol 2nd edition contains various definitions and examples of how the analysis of delay events should be considered, including matters on ‘delay’ and ‘concurrent delay’. It should be noted, however, that each project, including the applicable contract, will have its own unique variables and delay events, which may not be relevant to the models and procedures detailed for guidance in the SCL Protocol 2nd edition.

The SCL Protocol 2nd edition defines concurrent delay as follows:⁷

“10. Concurrent delay – effect on entitlement to EOT

True concurrent delay is the occurrence of two or more delay events at the same time, one an Employer Risk Event, the other a Contractor Risk Event, and the effects of which are felt at the same time. For concurrent delay to exist, each of the Employer Risk Event and the Contractor Risk Event must be an effective cause of Delay to Completion (i.e. the delays must both affect the critical path).

10.1 Concurrency is a contentious issue, both because there are differing views on the correct approach to dealing with concurrent delay when analysing entitlement to EOT and because there are differences about the meaning of concurrent delay itself.”

The SCL Protocol 2nd edition’s ‘true concurrent delay’ is similar to Marrin’s ‘narrow’ definition: the occurrence of two or more delay events at the same time, one an employer risk event and the other a contractor risk event, the effects of which are felt at the same time. However, the SCL Protocol adds to Marrin’s definition the requirement that the two delay events themselves occur at the same time, in addition to the effects of the delaying events being felt at the same time.

The SCL Protocol 2nd edition also clarifies that for concurrent delay to exist, the employer and contractor delay events must both affect the critical path. The SCL Protocol 2nd edition also acknowledges:

“10.4 In contrast, a more common usage of the term ‘concurrent delay’ concerns the situation where two or more delay events arise at different times, but the effects of them are felt at the same time.”

AN SCL PROTOCOL 2ND EDITION SCENARIO

The SCL Protocol 2nd edition gives the following scenario as to whether an employer delay is an effective cause of delay to completion if that employer delay occurs after the commencement of the contractor delay to completion but continues in parallel with the contractor delay:

“10.7 From a legal perspective, there are two competing views as to whether an Employer Delay is an effective cause of Delay to Completion where it occurs after the commencement of the Contractor Delay to Completion but continues in parallel with the Contractor Delay. This can be illustrated by the following example: a Contractor Risk Event will result in five weeks Contractor Delay to Completion, delaying the contract completion date from 21 January to 25 February. Independently and a few weeks later, a variation is instructed on behalf of the Employer which, in the absence of the preceding Contractor Delay to Completion, would result in Employer Delay to Completion from 1 February to 14 February.”

References

⁵ Adyard Abu Dhabi v SD Marine Services [2011] EWHC 848 (Comm) at [279].
⁶ Adyard Abu Dhabi v SD Marine Services [2011] EWHC 848 (Comm) at [282].
⁷ SCL Protocol 2nd edition, Guidance Part B: Guidance on Core Principles, pages 30 to 32.

In relation to the two competing views in the above scenario, the SCL Protocol 2nd edition says:

“10.8 On one view, the two events are both effective causes of Delay to Completion for the two-week period from 1 to 14 February because they each would have caused Delay to Completion in the absence of the other (with the subsequent delay from 15 February to 25 February caused by the Contractor Risk Event alone). This view may be supported by older English appeal court cases (no doubt predating critical path analysis) which provide that if the failure to complete the works is due in part to the fault of both the Employer and the Contractor, liquidated damages will not be payable. In a situation like the example described in paragraph 10.7 above, it can be argued that both the Employer Risk Event and the Contractor Risk Event are in part the cause of the Delay to Completion.”

“10.9 On the other view, the Employer Delay will not result in the works being completed later than would otherwise have been the case because the works were already going to be delayed by a greater period because of the Contractor Delay to Completion. Thus, the only effective cause of the Delay to Completion is the Contractor Risk Event. This is the consistent position taken in recent lower-level English court decisions.”

What view does the SCL Protocol 2nd edition recommend?

“10.10 The Protocol recommends the latter of these two views, i.e. where an EOT application relating to the situation referred to in paragraph 10.7 above is being assessed, the Employer Risk Event should be seen as not causing Delay to Completion (and therefore there is no concurrency). Concurrent delay only arises where the Employer Risk Event is shown to have caused Delay to Completion or, in other words, caused critical delay (i.e. it is on the longest path) to completion. The Protocol cautions that this recommendation would have to be reconsidered were an appeal court to take a different approach to this issue.”

The SCL Protocol 2nd edition therefore recommends the view that the employer risk event should not be seen as causing delay in this situation, and that there is consequently no concurrency. The SCL Protocol 2nd edition is saying, in line with current English precedent on the point, that concurrent delay only arises where the employer risk event is shown to have caused delay to completion.
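The arithmetic behind the Protocol’s recommended view can be replayed in a short sketch. This is purely illustrative and not part of the Protocol; it simply applies the paragraph 10.7 dates (with an arbitrary year) to show why, on the recommended view, the employer variation adds no critical delay:

```python
from datetime import date

# Illustrative replay of the SCL Protocol paragraph 10.7 example.
contract_completion = date(2021, 1, 21)

# Contractor Risk Event: five weeks of Contractor Delay, pushing completion to 25 February.
completion_after_contractor_delay = date(2021, 2, 25)

# Employer variation: absent the contractor delay, it would on its own have
# delayed completion to 14 February.
completion_if_only_employer_delay = date(2021, 2, 14)

# Recommended view (paragraph 10.10): the employer event only causes Delay to
# Completion if it pushes completion beyond the date already produced by the
# contractor delay, i.e. if it sits on the longest path.
projected_completion = max(completion_after_contractor_delay,
                           completion_if_only_employer_delay)
employer_critical_delay = (projected_completion
                           - completion_after_contractor_delay).days

print("Total Delay to Completion:",
      (projected_completion - contract_completion).days, "days")   # 35 days (five weeks)
print("Critical delay added by employer event:",
      employer_critical_delay, "days")                              # 0 -> no concurrency
```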

THE U.S.

The Court of Federal Claims, in George Sollitt Co v U.S.,⁸ sets out the following definition of concurrent delay:

“The exact definition of concurrent delay is not readily apparent from its use in contract law, although it is a term which has both temporal and causation aspects. Concurrent delays affect the same ‘delay period’. A concurrent delay is also independently sufficient to cause the delay days attributable to that source of delay.”

The industry standard for delay analysis in the U.S., the American National Standards Institute/American Society of Civil Engineers/Construction Institute standard 67-17, says “concurrent delay can be described as a situation where two or more critical delays are occurring at the same time during all or a portion of the delay time frame in which the delays are occurring.”

SUMMARY

Terms and definitions used when referring to concurrent delay are, as illustrated above, variable, inconsistent and can be somewhat confusing. However, the following, it is submitted, is consistent in all definitions:

• two or more delay events (causes of delay) which delay work/activities required to complete the project
• at least one delay is the responsibility of the employer and the other the responsibility of the contractor.

Question: for there to be concurrent delay, is it necessary that the delays, as a result of the delay events, commence at approximately the same time or have the same impact on the projected completion date? Put another way, is it possible to have concurrent delay where the delays, as a result of the delay events, commence at different times but at some point overlap, and hence each delay treated in isolation to the other would have a different impact on the completion date? In Adyard, the court said that “there is only concurrency if both events in fact cause delay to the progress of the works and the delaying effect of the two events is felt at the same time.” If this is correct, then in relation to the above question, isn’t the first delay merely creating, as a matter of fact, float for the second-in-line delay, meaning that, in such a situation, there is no concurrent delay?

⁸ George Sollitt Co v U.S., 64 Fed. Cl. 229, 239 (2005).

TECHNOLOGY

NEGOTIATING THE BIM LEARNING CURVE

CONSIDERATIONS FOR YOUR FIRST 5D PROJECT


Building Information Modelling (BIM) is fundamental to the long-term growth of Australia’s construction industry. Given the economic implications at stake for our broader economy, we simply must keep working to unlock the significant potential promised by BIM. But what exactly is standing in our way?

As of mid-2021, many of the obstacles hindering BIM adoption across the Australian industry have been addressed, slowly but surely. Bodies such as the Australasian BIM Advisory Board and buildingSMART Australasia act as champions for the BIM concept, working to promote useful frameworks and resources that can be adopted by businesses large and small. Progress has been made on open standardisation, while BIM is widely implemented commercially and often mandated on major infrastructure projects at a state level. The trajectory of BIM in this country has made it more important than ever for quantity surveyors and estimators to build a working knowledge of the process.

Being tasked with working on a 5D BIM project for the first time can be a daunting prospect. BIM is a collaborative methodology that requires all involved stakeholders to ‘buy in’ and take an active early role in order to guarantee success, but often the quantity surveyor/estimator is a mere afterthought in these discussions. We have provided a list of helpful considerations and discussion points for those preparing to work on their first BIM project. The following points must be considered as part of the Employer Information Requirements (EIR) and the BIM Execution Plan (BEP) or BIM Management Plan (BMP) that is developed between the project team and client at the start of a project.

A CHECKLIST FOR GETTING STARTED

The Australia and New Zealand BIM Best Practice Guidelines state that “… the quantity surveyor’s input early in the process is imperative to ensure the model is set-up with proper geometry and contains key information for effective cost planning.” These Guidelines go on to say that establishing a BEP can help project members to understand their roles and responsibilities for model creation, outline additional resources that may be needed, provide a baseline to measure progress and more. The importance of this kind of proactive evaluation cannot be overstated, given that issues encountered on BIM projects in the past have often been caused by avoidable oversight. We have divided this list of questions and thoughts into Fundamentals, Quantification and Wider Business Impacts. Quantity surveyors/estimators preparing for their first project may wish to pick and choose the discussion points most relevant to their role.



FUNDAMENTALS

What will the model be used for?

This might sound simple enough on the face of it but is really the key driver that will inform everything else. Is it just a design tool, is it for clash detection/ coordination purposes, cost estimation, facilities management, etc.?

Will all disciplines be involved?

How are the Architectural, Structural and Mechanical & Electrical teams planning to work? Will they all be utilising BIM, or will some be providing 2D designs? Will you use a different model for each discipline, or will you use a federated model?

How will revisions be handled?

There can be many thousands of revisions to the model during the design phases – how frequently will the quantification and costings be updated? Advanced software can make it easy to track changes in quantity and cost through the revisioning functionality, but controls still need to be put in place.

What file formats will be utilised?

Leading software such as iTWO costX can open models in DWFx, RVT, CPIXML and IFC formats (among others). There are pros and cons to each option impacting upon computer hardware requirements, data completeness, proprietary vs open standards, etc. The quantity surveyor/estimator should satisfy themselves as to which format is best suited for the planned project, potentially requesting examples of each at an early stage.

How will the models be transmitted?

In accordance with the point above regarding revisions, how will the files be transmitted or stored? This particularly relates to when comparisons need to be carried out on the model to understand any cost changes. As some file-sharing options may incur costs, stakeholders must consider how these expenses will be shared if necessary.

What content will be modelled vs 2D detailed?

Frequently, not all aspects of a design will be modelled; some elements can simply be drawn manually as detailing onto the 2D drawing outputs – for example, skirting boards within a building. The extent of this needs to be agreed upon beforehand, and everyone made aware, so that the quantity surveyor/estimator can make appropriate allowances for it.

Will naming conventions be minimal or descriptive?

Rather than an object in the model being called something like “WallCavInsMsnry”, it can be useful for the names to be fully descriptive and more widely understood. These descriptions may also change over time between concept and detailed design phases, so the process for updating model content needs to be understood by all.

What is the contractual standing of the model?

What is the status of the model versus the 2D drawings, and can this be influenced at all?

QUANTIFICATION

What coding is going to be applied to the model?

The model is frequently coded with Uniclass data, whereas the quantity surveyor may be preparing an AIQS elemental cost plan or an ANZSMM Bill of Quantities, and there isn’t a complete (public) mapping between these coding systems. It may be possible for the designers to easily add relevant coding to the model objects, making the quantity surveyor’s takeoff much easier.
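As a rough illustration of the kind of mapping that can be agreed with the design team, the sketch below pairs model object codes with elemental cost plan codes. All of the codes and quantities shown are hypothetical placeholders, not real Uniclass, AIQS or ANZSMM references:

```python
# Hypothetical mapping from model object codes to elemental cost plan codes.
# Every code below is an illustrative placeholder only.
CODE_MAP = {
    "MDL_WALL_EXT": "EL-02 External Walls",
    "MDL_WALL_INT": "EL-03 Internal Walls",
    "MDL_SLAB_GF":  "EL-01 Substructure",
}

def elemental_code(model_code: str) -> str:
    """Return the cost plan element for a model object code, flagging gaps."""
    return CODE_MAP.get(model_code, "UNMAPPED - query with design team")

# Example objects exported from the model (quantities in m2, invented).
objects = [("MDL_WALL_EXT", 420.0), ("MDL_ROOF", 310.0)]
for code, qty in objects:
    print(f"{code:>14} {qty:>8.1f} m2 -> {elemental_code(code)}")
```

Even a simple lookup like this makes unmapped objects visible early, which is when the designers can still add the relevant coding to the model.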

What rounding will be applied for Project Units?

If every object is exported to zero decimal places for the quantity information, this can lead to discrepancies on large projects. Cost consultants must ensure that the designers set the rounding appropriately to suit requirements.
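A quick, purely illustrative calculation shows how badly zero-decimal rounding can drift when small quantities recur across many objects (the figures below are invented for the example):

```python
# Illustrative only: 2,000 identical objects, each containing 0.4 m3 of concrete.
quantity_per_object = 0.4
count = 2000

exact_total = quantity_per_object * count           # 800.0 m3
rounded_total = round(quantity_per_object) * count   # 0 m3 if each object is exported to 0 dp

print(f"Exact total:   {exact_total:.1f} m3")
print(f"Rounded total: {rounded_total} m3 (each object exported to 0 decimal places)")
```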

Should work be split into individual parts or assemblies?

Rather than having a single object of a composite slab exported, it can be useful to have the individual components exported and quantified (e.g. blinding, concrete, rebar, screed etc).

What are the deliverables?

How will the resulting cost plan, estimate or Bill of Quantities be shared with the client or other stakeholders?

How will quality assurance be handled?

How will checks be carried out at various stages of the process? For example, the model needs to be checked upon first receipt, and the final deliverable needs to be cross-referenced against the model or drawings (depending on the contract).

WIDER BUSINESS IMPACTS

How will success be measured?

Will the time taken for both the initial takeoff and future revisions be tracked and compared to other more manual workflows? What are the ambitions for quantification from the model versus 2D drawings – and is this calculated by item or by value?

How long is the learning curve allowed for?

The first time that teams undertake any new process there will inevitably be a learning curve, and this needs to be planned and allowed for to prevent staff returning to older methods. Will there be a senior sponsor within the organisation to support, promote and monitor progress?

How will this impact future work winning?

How will this information be fed back to business development teams to be utilised in fee proposals, and how will fee proposals be written in future to specify minimum requirements for models?

How will you enable knowledge sharing?

The lessons learned on the initial 5D flagship projects need to be recorded and shared among the wider team to facilitate improved outcomes in the future. It is also worth noting that further discussions can be carried out in conjunction with the design process, rather than waiting for weeks and months for the design to be completed and then rushing to carry out checking and quantification. It may even be an option to request a sampling process, whereby the design teams provide the quantity surveyor/ estimator with samples of the content they are using. This can allow the quantity surveyor/estimator to check the proposed material in line with planned workflows and project-specific breakdowns. Proactive planning of this nature can support a quick costing and quantification process once the final model is delivered.

5D BIM: A DRIVER OF DIGITAL TRANSFORMATION

When looking at the industry through a wider lens, it is clear that many construction businesses are already enacting digital transformation plans to support their competitive future. Large and small enterprises have recognised the pressing need to innovate, given the untapped potential that has characterised our industry in recent years. BIM is a key driver of digital transformation in construction, as stated by the World Economic Forum in an expansive series of reports published with The Boston Consulting Group, entitled “Shaping the Future of Construction.” It was noted that everything from improved cost estimation to effective sequencing and clash detection can be delivered through intelligent use of BIM.

The considerations covered in this piece merely scratch the surface of what quantity surveyors/estimators must be aware of before working with 5D BIM. While the learning curve may be steep, resources and industry knowledge are constantly improving to the benefit of those ready to get started. Advanced software platforms such as iTWO costX are available to support complex 5D workflows, with users able to view and take off quantities from 3D models before automatically linking to user-defined rate libraries and workbooks. Such programs are accessible to large and small businesses, with a variety of deployment options available to suit agile business requirements.

In any case, quantity surveyors/estimators who remain unconvinced about the commercial advent of BIM must re-evaluate their thinking. BIM is a proven methodology, and it is here to stay. Given the vast importance of our industry, it is integral that professionals from across disciplines keep endeavouring to realise the manifold benefits on offer.

Learn more about iTWO costX by RIB Software by visiting the website www.itwocostx.com. RIB Software has paid for and written this advertorial.

TECHNOLOGY

MACHINE LEARNING FOR QUANTITY SURVEYORS

BY CONG (CODY) BUI MAIQS

Big data, artificial intelligence (AI) and machine learning (ML) have been popular buzzwords recently. However, we have not seen much of their application in quantity surveying (QS) work. This article will demonstrate a practical case study which will hopefully clear some of the mist around ML and suggest some applications for quantity surveyors.

First of all, let’s go through a quick introduction to Machine Learning. According to IBM (IBM, 2021), “Machine learning is a branch of artificial intelligence focused on building applications that learn from data and improve their accuracy over time without being programmed to do so.” The two main branches of ML (supervised ML and unsupervised ML), together with their typical applications, are illustrated in Figure 1. Linear regression belongs to supervised ML, where the data is pre-categorised or numerical. It is also the simplest ML algorithm and is the focus of this article. The article will give readers a brief idea of how ML can be utilised by going through a data science competition related to cost prediction, a common challenge facing quantity surveyors.



ABOUT THE CASE STUDY

The case study is a competition called House Prices: Advanced Regression Techniques posted on kaggle.com (an online community of data scientists and machine learning practitioners). In this competition, the participants are provided with data on more than 1,200 houses in Ames, Iowa, USA. The dataset comprises 79 explanatory variables which describe almost every aspect of residential houses in the area. The contestants then develop ML models aiming to predict the sale prices of other houses based on their characteristics. This competition bears a close resemblance to the QS work of project cost estimation, and thus makes a great example to demonstrate the potential usage of ML in QS work. For the sake of simplicity, some technical parts have been purposely left out and the article will demonstrate three general steps in the process of building a linear regression machine learning model: acquiring the data, exploring the data, and building a model.

ACQUIRING THE DATA

Acquiring the data is the very first step in building a ML model. Nothing can be done until a certain amount of good data is collected. In this competition, the data has already been collected and processed. Most of the time, this may not be the case and data may not be readily available. In fact, data professionals spend up to 60% of their time on cleaning and organising data (Forbes, 2016). Thus, it is important not to underestimate the time needed to collect and process raw data in the initial stage.

Figure 1 – Classical Machine Learning (Blog, 2021). Supervised learning (data is pre-categorised or numerical) covers classification (predict a category) and regression (predict a number); unsupervised learning (data is not labelled in any way) covers clustering (divide by similarity), dimension reduction (find hidden dependencies) and association (identify sequences).

The following are some of the variables extracted from the data:

• LotArea - lot size in square feet
• YearBuilt - original construction date
• Foundation - type of foundation
• Bedroom - number of bedrooms above basement level
• Kitchen - number of kitchens
• Heating - type of heating
• OverallQual - overall material and finish quality
• GarageCars - size of garage in car capacity
• RoofStyle - type of roof.

As presented, some of the variables are discrete and some are continuous. For example, RoofStyle, which can be flat, gable, gambrel, hip, mansard, or shed, is discrete. On the other hand, LotArea is continuous and can be any number ranging from 100 ft² to 1,000 ft². This is a good example to start building up our own database. The data is saved in a CSV file, with columns and rows describing the characteristics of the data and the data points respectively. Most of the data available in QS firms, for example cost plans and BOQs, is in the form of PDF, Excel or measurement software extract files. As a result, great effort may need to be put in to make these data as structured and consistent as the dataset in this case study. However, once a well-organised database is acquired, insights can be gained almost immediately, even without deploying advanced techniques such as ML. It is suggested that data collection should be integrated as part of the daily workflow, thus eliminating tiresome data entry work.

EXPLORING THE DATA

Once a certain amount of data is obtained, non-programming visualisation tools such as Power BI or Tableau can be used to gain insights, spot trends, and detect outliers or potential errors. It is important to understand the nature of the data which then will help to choose the appropriate algorithms. For example, Figure 2 shows the relationship between Lot Area and Sale Price. It is easy to notice that most of the records are from building type 1Fam with the Lot area ranging from 5000 ft2 to 30,000 ft2. This indicates that our model will have greater accuracy when predicting sale prices for houses within this range. Statistical analysis can be put into use in this part for a greater understanding.
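The same kind of exploration can also be scripted rather than built in Power BI or Tableau. A minimal sketch, assuming the DataFrame df loaded in the earlier snippet and that matplotlib is installed, reproduces a view similar to Figure 2:

```python
import matplotlib.pyplot as plt

# Scatter of Lot Area vs Sale Price, coloured by building type (mirrors Figure 2).
for bldg_type, group in df.groupby("BldgType"):
    plt.scatter(group["LotArea"], group["SalePrice"], s=10, label=bldg_type)

plt.xlabel("Lot Area (sq ft)")
plt.ylabel("Sale Price (USD)")
plt.title("Relationship between Lot Area and Sale Price")
plt.legend()
plt.show()
```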

BUILDING A MACHINE LEARNING MODEL

As presented in the introduction, there are many different types of ML algorithms. It is important to choose the appropriate algorithm based on the type and amount of data as well as the expected application. In this case study, linear regression is among the many suitable algorithms that can be used to predict the sale prices of houses. Let’s dive in and explore a bit of theory. Mathematically, the equation of Linear Regression is as follows:

y = α + βx (1)

Where y is the value to be predicted based on the given value of x.
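As a toy illustration of equation (1), the coefficients α and β can be estimated with an ordinary least-squares fit; the numbers below are invented purely for the example:

```python
import numpy as np

# Invented example data: x could be floor area, y the price (illustrative only).
x = np.array([50.0, 80.0, 100.0, 120.0, 150.0])
y = np.array([210.0, 300.0, 360.0, 420.0, 510.0])

# np.polyfit with degree 1 returns the slope (beta) and intercept (alpha).
beta, alpha = np.polyfit(x, y, 1)
print(f"y = {alpha:.1f} + {beta:.2f}x")

# Predict y for a new x using the fitted line.
print("Prediction for x = 90:", alpha + beta * 90)
```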

Figure 2 – Relationship between Lot Area and Sale Price (author). Scatter plot of Lot Area (roughly 2,000 to 34,000 ft²) against Sale Price (up to $700K), with points coloured by building type (1Fam, 2fmCon, Duplex, Twnhs, TwnhsE).

Works Cited

Ashish. (2019, May 1). Kaggle. Retrieved from https://www.kaggle.com/ashydv/housing-price-prediction-linear-regression
Blog, V. (2021, May 1). Machine Learning. Retrieved from https://vas3k.com/blog/machine_learning/
Bremer, M. (2012, January 1). Cornell University. Retrieved from http://mezeylab.cb.bscb.cornell.edu/labmembers/documents/supplement%205%20-%20multiple%20regression.pdf
Forbes. (2016, March 23). Retrieved from https://www.forbes.com/sites/gilpress/2016/03/23/data-preparation-most-time-consuming-least-enjoyable-data-science-task-survey-says/


The black line in Figure 3 indicates the best-fit straight line based on the given data points. The line, or its equation, can then be used to estimate the values of the dependent variable.

Figure 3 – Line of Regression (Prasrahul, 2020). Data points plotted against an independent variable, with the best-fit regression line drawn through them.

With the same principle, a multiple linear regression model with k predictor variables x₁, x₂, ..., xₖ and a response y can be written as:

y = β₀ + β₁x₁ + β₂x₂ + β₃x₃ + ... + βₖxₖ + ε (2) (Bremer, 2012)

In our case, the predictor variables x₁, x₂, x₃, ... are the aforementioned variables, including LotArea (lot size in square feet), YearBuilt, Foundation (type of foundation), Bedroom (number of bedrooms above basement level), Kitchen (number of kitchens), Heating (type of heating), and so on. In principle, if we can best estimate the values of the coefficients β₀, β₁, β₂, β₃, ... and ε, then we have the formula for predicting the sale price y. There is a little mathematical theory involved in estimating the coefficients. Fortunately, these days, with the help of programming, the above formula can be solved with just a little coding. In one estimation (Ashish, 2019), formula (2) has been solved as:

Price = 0.35×area + 0.20×bathrooms + 0.19×stories + 0.10×airconditioning + 0.10×parking + 0.11×prefarea

LOOKING FORWARD

Although there is much more to be done before real benefits can be drawn from Machine Learning, it holds a lot of potential. A great amount of data may already exist in organisations’ databases, such as cost plans, BOQs, quotes, variation assessments, and progress claims. This data can be made use of and turned into valuable new assets. Two or three cost plans may not be of much significance, but imagine the tremendous potential of having a database of one thousand cost plans organised in a structured way. Hopefully, this article has provided readers with some basic knowledge about Machine Learning as well as a new perspective on the method and how to handle data.

The author is keen to be in touch with readers of the same interest. Please send any comments or suggestions to codybmcm@gmail.com. Disclaimer: The information contained in this article is accurate to the best of the author’s knowledge. No warranty or guarantee is expressed or implied regarding the accuracy of any information or data.

IBM. (2021, May 1). IBM Cloud Learn Hub. Retrieved from https://www.ibm.com/cloud/learn/machine-learning
Kaggle. (2021, May 1). House Prices - Advanced Regression Techniques. Retrieved from https://www.kaggle.com/c/house-prices-advanced-regression-techniques/overview
Prasrahul. (2020, August 20). Medium. Retrieved from https://medium.com/analytics-vidhya/hype-around-machine-learning-1a80283d7655