Wednesday, October 31, 2018

A Good Home Valuation System Allows Time Adjustments and Flexible Valuation Dates

Since most home valuation sites comprise at least a year's worth of historical sales, they must educate users about, and allow them to apply, time adjustments to the older sales; even four very similar sales from prior quarters are not usable as comps unless they are adjusted for the time of sale.

Additionally, they should allow users the flexibility to choose a valuation date other than the current date, meaning a forward or a backward date as well. When a backward date is selected, the newer sales (those past the valuation date) are adjusted back to that date. This flexibility allows Tax Assessors and Portfolio Managers to choose a valuation date in line with their roll/portfolio requirements, often experimentally (e.g., does a forward sales sample validate the output of the modeling sales sample?).

The first snapshot shows how to time-adjust sales at 12% annual growth (1.00% per month) to arrive at the subject value for a forward date of 01-31-2018, while the second snapshot shows a time adjustment at 6% annually (0.50% per month) to a backward date of 06-30-2017.
Of course, to account for negative price growth, place a minus sign (-) in front of the adjustment rate.

Upon application of the time adjustment to the sales population, the sale prices are replaced in the analysis by the adjusted sale prices, contributing to the valuation of the subject.
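The linear adjustment described above can be sketched in a few lines of Python. This is a minimal illustration, not the site's actual code: the function name, prices and dates are hypothetical, and the monthly rate is applied as a simple (non-compounding) factor, consistent with the linear adjustments discussed in this post.

```python
from datetime import date

def time_adjust(sale_price, sale_date, valuation_date, monthly_rate):
    """Linearly time-adjust a sale price to a chosen valuation date.

    monthly_rate is the simple monthly growth rate (e.g. 0.01 for 1%/mo);
    use a negative rate for a declining market. A sale occurring after a
    backward valuation date yields negative elapsed months, so its price
    is adjusted back to that earlier date.
    """
    months = (valuation_date.year - sale_date.year) * 12 + \
             (valuation_date.month - sale_date.month)
    return round(sale_price * (1 + monthly_rate * months), 2)

# Forward adjustment: 12% annual growth (1%/mo) to 01-31-2018
asp_fwd = time_adjust(500_000, date(2017, 7, 31), date(2018, 1, 31), 0.01)

# Backward adjustment: 6% annual (0.5%/mo) to 06-30-2017 for a later sale
asp_bwd = time_adjust(500_000, date(2017, 12, 30), date(2017, 6, 30), 0.005)
```

The same function handles current, forward and backward valuation dates; only the sign of the elapsed months changes.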

I picked the above graphics from Homequant as I own and operate it, to avoid having to deal with any copyright issues. My Homequant site is totally self-directed (no modeled values), totally free (no strings), and requires no login or registration whatsoever. Please use the system that works best for you.

All time adjustments in Homequant are linear. In Automated Valuation Modeling (AVM), non-linear adjustments are often used as AVMs usually require longer sales stretches, generally 18 to 24 months. If you are trying to understand how to make advanced non-linear adjustments, please check out my recent book on AVM "An Illustrated Guide to Automated Valuation Modeling (AVM) in Excel..." on Amazon. If you are a new graduate/analyst, request a complimentary copy of the book.

Tuesday, October 30, 2018


A Good Home Valuation System Allows Users to Differentiate between Sales and Comparable Sales

Sales vs. Comparable Sales

A list of sales - by default - does not become comparable sales ("comps"). Sales - even when drawn from the same neighborhood - must be quantitatively adjusted for characteristics and time to become comps. Once adjusted, the differences in property characteristics, distance and time (01/2017 and 12/2017 sales are not the same) become irrelevant. 

So, always ask your Broker to show how the comps have been adjusted. 

Here is a snapshot of the adjustment process:

The above table shows that although these are the 10 best pooled sales to value the defined subject, they differ considerably in distance, time of sale, size and age. They therefore have to be quantitatively adjusted (using sound econometric parameters drawn from the local market, explained at length in other posts) to be considered and accepted as comps; absent those adjustments, they remain a set of random sales.

Once they are adjusted, the Comps Grid will show the line item adjustments as well as the total adjustment for each of the final five comps:

Upon adjustment, the Sale Prices are replaced in the analysis by the Adjusted Sale Prices (ASP) of the comps, which contribute collectively to the valuation of the subject. The Comps Grid here has been ranked by 'Distance', meaning the comp closest to the subject becomes Comp #1.
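As a rough illustration of what the grid does under the hood, here is a minimal Python sketch. All IDs, dollar figures and distances below are invented, not taken from the snapshot: each comp's line-item adjustments are summed, added to its sale price to produce the ASP, and the pool is then ranked by distance so the nearest comp becomes Comp #1.

```python
# Hypothetical comps with line-item adjustments (illustrative numbers only)
comps = [
    {"id": 3, "sale_price": 510_000, "distance_mi": 0.4,
     "adjustments": {"size": -12_000, "age": 4_000, "time": 15_300}},
    {"id": 7, "sale_price": 495_000, "distance_mi": 0.2,
     "adjustments": {"size": 8_000, "age": -2_000, "time": 9_900}},
]

for comp in comps:
    # Total adjustment = sum of the line items; ASP = sale price + total
    comp["total_adjustment"] = sum(comp["adjustments"].values())
    comp["asp"] = comp["sale_price"] + comp["total_adjustment"]

# Rank by distance: the comp nearest to the subject becomes Comp #1
comps.sort(key=lambda c: c["distance_mi"])
```

The subject value would then be derived from the ASPs, not the raw sale prices.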

I picked the above graphics from Homequant as I own and operate it, to avoid having to deal with any copyright issues. My Homequant site is totally self-directed (no modeled values), totally free (no strings), and requires no login or registration whatsoever. Please choose the site that works best for you.

Monday, October 29, 2018

How to Analyze and Present Large and Complex Home Sales Data – in 30 Minutes (2 of 2)

-- Intended for Start-up Analysts and Researchers --

In our prior post (1 of 2) we talked about analyzing and presenting a large and complex dataset in 30 minutes. Would you handle it differently if you had 60 minutes? Here is one approach you might like to consider:

1. Just because you are starting out, do not underestimate yourself. The very fact that you have been tasked with this critical presentation speaks volumes, so take full advantage of this visibility in narrowing the competition down. These meetings are often frequented by other department heads and high-level client representatives, leading to significant loss of time in unrelated (business) discussions. The best way to prepare for such contingencies is to split the presentation up into a two-phase solution where phase-1 leads seamlessly to phase-2. 

2. In a business environment, it's never a good idea to start with a complicated stat/econ model. Start a bit slow, but use your analytical acumen and presentation skills to gradually bring everyone onto the same page, thus retaining maximum control over the presentation (time and theme). Therefore, the phase-1 solution should be the same as the full* 30-min solution we detailed before (*including the sub-market analysis). Even if the meeting drifts into unrelated business chit-chat off and on, you will still be able to squeeze in the phase-1 solution, thus offering at least a baseline. If, instead, you have one all-encompassing solution, you may end up offering virtually nothing.

3. Now that you have finished presenting the phase-1, establishing a meaningful baseline, you are ready to transition to the higher-up phase-2 solution. In other words, it's time to show off your modeling knowledge. In phase-1 you presented a baseline Champ-Challenger analysis (Champ=Median Sale Price, MoM; Challenger=Median SP/SF, MoM). You used the "Median" to avoid having to clean up the dataset for major outliers. Here is the caveat though: Sales, individually, are mostly judgment calls; for example, someone bent on buying a pink house would overpay; an investor would underpay by luring a seller with a cash offer, etc. In the middle (middle 68% of the bell curve), the so-called informed buyers would use five comps, usually hand-picked by the salespeople, to value their subjects - not an exact science either.   

4. Now, let's envision where you would be at this stage: 30 minutes in hand and brimming with confidence. That is not enough time to develop and present a true multi-stage, multi-cycle AVM (see my recent post on 'How to Build A Better AVM'). So settle for a straightforward Regression-based modeling solution, leaving time to add a few new slides to the original presentation. Build the model as one log equation with a limited number of variables (though covering all three major categories). Variables you might like to choose: Living Area, Age, Bldg Style, Grade, Condition and School/Assessing District. Avoid 2nd-tier variables (e.g., Garage SF, View, Site Elevation, etc.).

5. Derive the time adjustment factors from phase-1 (it's a MoM series) and create the Time Adjusted Sale Price (ASP), the dependent variable in your Regression model. Explain this connection in your presentation so the audience (including your SVP/EVP boss) knows the two phases are not mutually exclusive; rather, one is a stepping stone to the other. At this point, you could face the question "Why did you split it up into two?" Keep your answer short and truthful: "It's a time-based contingency plan."

6. Keep the Regression output handy, but do not insert it into the main presentation, as it is a log model (the audience may not be able to relate to the log parameter estimates). If the issue comes up, talk about three important aspects of the model: a) variable selection (how you managed to represent all three categories), b) the most important variables as judged by the model (walk through the t-stats and p-values) and c) the overall accuracy of the model (r-squared, F-statistic, confidence, etc.).

7. Present model results in two simple steps. Value Step: ASP vs. Regression values. Show the entire percentile curve, 1st to 99th. Point out the smoothness of the Regression values vis-à-vis ASP. Even arms-length sales tend to be somewhat irrational at both ends of the curve (<=5th and >=95th); the standard deviation of the Regression values will be much lower than ASP's. Ratio Step: Run stats on the Regression Ratio (Regression Value to ASP). It's easier to explain the Regression Ratios than the natural numbers, so spend more time on the ratios.

8. Time permitting, run the above stats both ways: with and without outliers. Define outliers by the Regression Ratios and keep it simple; for example, remove all ratios below the 5th and above the 95th percentile, or below 70 and above 143, etc. On this outlier-free output, run Std Dev, COV, COD, etc. These stats will be significantly better than the prior (with-outliers) ones. Another common question is: "Why no waterfront in your model?" The answer is simple: waterfront parcels generally comprise less than 5% of the population, making representativeness difficult to test. FYI: in an actual AVM, if the sold waterfront parcels properly represent the waterfront population, the variable could be tried in the model, as long as it clears the multi-collinearity test.

9. Last but not least, be prepared to face an obvious question: "What is the point of developing this model?" Here is the answer: a sale price is more than a handful of top-line comps. It comprises an array of important variables like size, age, land and building characteristics, and fixed and micro locations, so only a multivariate model can do justice to sale price by properly capturing and representing all of these variables. The output from this Regression model is a statistically significant market replica of the sales population. Moreover, the model can be applied to the unsold population to generate very meaningful market values. Simply put, this Regression model is an econometric market solution. Granted, the unsold population could be comp'd, but that is a very time-consuming and subjective process.
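For readers who want to see steps 4 through 8 in code, here is a compact sketch on synthetic data. Every variable name, coefficient and growth rate below is an assumption for illustration, not the actual model: it chains a phase-1-style monthly factor into the Time Adjusted Sale Price (ASP), fits one log equation by ordinary least squares, then runs Regression Ratio stats (COD/COV) with a 5th/95th-percentile trim.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 400

# --- Synthetic 18-month sales sample (all names/values are illustrative) ---
living_area = rng.uniform(1200, 3200, n)
age = rng.uniform(0, 60, n)
district = rng.integers(0, 2, n)        # one school/assessing district dummy
sale_month = rng.integers(0, 18, n)     # months since the start of the window

# Phase-1 MoM medians imply ~0.5%/mo growth; chain the factor into ASP,
# the dependent variable (every sale adjusted up to month 18)
monthly_factor = 1.005
months_to_val_date = 18 - sale_month
true_log_price = (11.0 + 0.00035 * living_area - 0.004 * age
                  + 0.08 * district + rng.normal(0, 0.05, n))
sale_price = np.exp(true_log_price) / monthly_factor ** months_to_val_date
asp = sale_price * monthly_factor ** months_to_val_date

# --- One log equation with a limited set of 1st-tier variables ---
X = np.column_stack([np.ones(n), living_area, age, district])
beta, *_ = np.linalg.lstsq(X, np.log(asp), rcond=None)
reg_value = np.exp(X @ beta)            # back-transform to dollar values

# --- Regression Ratios (regression value / ASP, x100) and outlier trim ---
ratios = 100 * reg_value / asp
trimmed = ratios[(ratios >= np.percentile(ratios, 5))
                 & (ratios <= np.percentile(ratios, 95))]

def cod(r):
    """Coefficient of Dispersion: mean abs deviation from the median, % of median."""
    med = np.median(r)
    return 100 * np.mean(np.abs(r - med)) / med

cov = 100 * np.std(trimmed, ddof=1) / np.mean(trimmed)  # Coefficient of Variation
```

As step 8 predicts, the dispersion stats on the trimmed ratios come out tighter than on the full set.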

Ace the next presentation. Be a hero. Prove to your bosses you are a future CEO.

Good Luck!

- Sid Som, MBA, MIM
President, Homequant, Inc.

How to Analyze and Present Large and Complex Home Sales Data – in 30 Minutes (1 of 2)

      -- Intended for Start-up Analysts and Researchers --

If you have very limited time - say 30 minutes - to summarize and present a fairly large and complex home sales dataset, comprising 18 months of data, with 30K rows and 10 variables, here is one approach you might like to consider:

1. Given the limited time, instead of trying to crunch the data in a spreadsheet, invoke your favorite statistical software, like SAS. What SAS will do in four short statements (Proc Means, Var, Class and Output) and in a matter of minutes will take much longer to accomplish in a spreadsheet. When you are starting out, take full advantage of these types of highly visible, often rewarding, challenges to narrow your competition down.

2. Have a realistic game plan. Instead of shooting for an array of parameters, start with the most significant one, i.e., the Monthly Median Sale Price (and the normalized sale price). Since the median is not prone to outliers, you do not have to edit the dataset for outliers, saving a significant amount of time.

3. Now that you have the monthly median prices, you are ready to create graphs for the presentation. While you may create one graph depicting both prices (Y1 and Y2) against months (X axis), keep them separated for ease of presentation. 

4. If you are more comfortable graphing in Excel (in fairness to the remaining time), transfer the output from SAS to Excel. Make sure your graphs are properly annotated and dressed up with axis titles, legends, gridlines, etc. Remember, just doing things right is not good enough, learn to do things elegantly as well. 

5. Since you have summarized and rolled up so much data behind one or two graphs, make sure they not only tell the overall story, but also convey enough business intelligence to make you look like a hero in front of your EVP/SVP. In the presence of clients, it enhances their image as well. So, add trendlines alongside the data trend. Select the primary trendline by eyeballing the data trend (linear, logarithmic, polynomial, etc.). Also, add a moving average trendline to iron out any monthly aberrations. When the series is extended, use 3-month moving averages.

6. Keep your reporting verbiage clear and concise. Explain the makeup of the dataset; methodology including the use of monthly medians; how the normalized prices add value and help validate the primary; trendlines and their statistical significance; other statistical measures like r-squared, slopes, etc. you might display on the graphs (avoid printing equations on the graphs). 

7. Add business intelligence to your talking points. First off, stick to the market you are presenting, but show off your knowledge of that market by highlighting: possible headwinds and tailwinds; how that market would react to an inverted yield curve; whether there is a structural shift in demand for homes (are more millennials showing interest in that market); the NAR's prediction of the summer inventory there; whether the inventory of affordable homes is on the rise; any expected change to the FHA to help first-time homebuyers in general, etc.

8. Try to control the conversation by sticking to what you have, rather than what you don't have. For example, out of the 10 variables, you managed to use only 3 (Sale Price, Sale Date and Bldg SF), so do not start a conversation about the other important variables - Lot size, Age, Bldg Characteristics and Location - you had to leave out ('If I had 30 more minutes' would be a wrong hypothesis to test). If that question comes up, answer it intelligently and truthfully emphasizing, of course, the utility of the 3 you happened to use.

9. Let's assume that you managed to complete the first cycle (as indicated above) in 20 minutes. In that case, go back to SAS and crunch the sales analysis by the sub-markets (Remember: Location! Location! Location!). This is how you walk down on the analysis curve. Have these printouts handy, but do not try to alter the initial presentation.
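To make step 1 and the moving-average idea in step 5 concrete, here is a small pure-standard-library Python sketch. The sales rows are invented for illustration; it mimics a PROC MEANS-style rollup, computing monthly medians of Sale Price and SP/SF, then a 3-month trailing moving average.

```python
import statistics
from collections import defaultdict

# Illustrative rows: (sale month 'YYYY-MM', sale price, building SF)
sales = [
    ("2017-01", 480_000, 1_900), ("2017-01", 515_000, 2_100), ("2017-01", 499_000, 2_000),
    ("2017-02", 505_000, 2_000), ("2017-02", 2_500_000, 6_000), ("2017-02", 512_000, 2_050),
    ("2017-03", 509_000, 1_950), ("2017-03", 521_000, 2_100), ("2017-03", 517_000, 2_000),
]

# Roll up by month; the medians shrug off the $2.5M outlier,
# so no prior outlier editing is needed
by_month = defaultdict(list)
for month, price, sf in sales:
    by_month[month].append((price, price / sf))

median_sp = {m: statistics.median(p for p, _ in v) for m, v in sorted(by_month.items())}
median_sp_sf = {m: statistics.median(r for _, r in v) for m, v in sorted(by_month.items())}

# 3-month trailing moving average to iron out monthly aberrations
def moving_average(series, window=3):
    return [sum(series[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(series))]

ma3 = moving_average(list(median_sp.values()))
```

The two median series (SP and SP/SF) are exactly the Champ and Challenger lines that feed the graphs in steps 3 and 4.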

Ace the next presentation. Be a hero. Prove to your bosses you are a future CEO.

Good Luck!
- Sid Som, MBA, MIM
President, Homequant, Inc.


Sunday, October 28, 2018

A Step-by-Step Guide to Home Valuation with Comparable Sales

Step-1: Define the Subject


Once you zero in on a particular property you are interested in ("Subject"), compare the listed data (MLS or other listing services) with the County database (public record and available online). If they are at variance, call the Assessor's office and ask for an explanation. Land data could differ slightly as more and more public offices use GIS algorithms, while sellers/listing agents would extract the data from the original documents, potentially paving the way for limited discrepancies. Building SF and Year Built must be close, if not identical.

Step-2: Define Comps Criteria


Comps criteria are collectively a function of the sub-market (greater neighborhood) the comps are drawn from. In a very liquid market (with enough recent arms-length sales) the ranges can be tighter, and vice versa. Since the comps must be similar to the subject in physical attributes, a set of selection ranges needs to be defined; similarly, adjustment rates are needed to equalize the differentials. For example, while the subject is 2,000 SF, the comps could range between 1,600 and 2,400 SF, thus requiring dollar adjustments: the 2,400 SF comp must be adjusted down to 2,000 SF while the 1,600 SF comp must be adjusted up, at the local replacement cost new ($100/SF in the example). The rates could be significantly higher in expensive coastal markets and lower in rural areas.

Since the comps database might comprise an admixture of older and newer sales (generally 12 to 24 months depending on the liquidity of the market), all comps must be time-adjusted to a particular valuation date, thereby making sale dates for the pooled comps irrelevant. Once time-adjusted, there is no difference between two sales occurring in two different quarters. The sub-market in the above example did much better than its peers so an annual growth of 12% (1% per month) has been used to adjust all sales up to the valuation date. This piece of research (collecting the growth data at the sub-market level) is important. In a declining market, the adjustment would be negative meaning decay in value. Any quality market-oriented application would allow all three valuation dates: current, forward and backward.
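The SF equalization described in this step is simple arithmetic, sketched below; the function name is mine, and the $100/SF rate comes from the example above.

```python
def size_adjustment(subject_sf, comp_sf, cost_per_sf=100):
    """Dollar adjustment to equalize a comp's size to the subject's.

    A larger comp is adjusted down (negative), a smaller one up (positive),
    at the local replacement cost new ($100/SF in the example).
    """
    return (subject_sf - comp_sf) * cost_per_sf

adj_large = size_adjustment(2_000, 2_400)  # adjust the 2,400 SF comp down
adj_small = size_adjustment(2_000, 1_600)  # adjust the 1,600 SF comp up
```

In a pricier coastal market the same function would simply take a higher cost_per_sf.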

Step-3: Select Comps


Based on the selection criteria set forth, most self-directed valuation systems will return a pool of up to 10 most recent comps, five of which will eventually contribute to the subject value. I am using the Homequant system as I own and operate it, to avoid having to deal with any copyright issues. These are Homequant system requirements (a pool of up to 10 and 5 comps to value a subject). You may find another system online with different scoring requirements (choose the one that works best for you).

Assuming you have more than 5 (in our example, we got all 10), you have to evaluate them and choose the best 5. The three most common methods to evaluate the pool are: Distance (comps closest to the subject), Recency (most recent sales) and Least Adjustments (comps requiring the least adjustments, ignoring signs; that is, -2,500 and +2,500 are considered identical contributors). We chose the Distance method; in other words, we chose the 5 comps that are closest, in linear distance, to the subject. Of course, before you start the evaluation process, remove the two outliers as a rule: the comps with the minimum and maximum adjusted sale prices; in the above example, comps #1 and #10 are the two outliers. Again, removal of outliers is possible only if the pool contains more than 5 comps. Notice that the lineup of the resulting 5 comps looks statistically meaningful.
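The selection logic (drop the min- and max-ASP outliers, then keep the five nearest comps) might look like this in Python; the pool data below is invented, not taken from the snapshot.

```python
def select_comps(pool, n=5):
    """Drop the min/max adjusted-sale-price outliers, then keep the n nearest."""
    if len(pool) >= n + 2:               # outlier removal needs a surplus over n
        ranked = sorted(pool, key=lambda c: c["asp"])
        pool = ranked[1:-1]              # remove the min- and max-ASP comps
    return sorted(pool, key=lambda c: c["distance_mi"])[:n]

# Hypothetical 7-comp pool: (ASP, distance in miles)
pool = [{"id": i, "asp": a, "distance_mi": d} for i, (a, d) in enumerate(
    [(610_000, 0.9), (520_000, 0.3), (515_000, 0.5), (508_000, 0.2),
     (512_000, 0.7), (518_000, 0.4), (430_000, 0.1)], start=1)]

best5 = select_comps(pool)   # ids 1 and 7 are the ASP outliers here
```

Swapping the final sort key (e.g., for sale date or absolute total adjustment) would give the Recency or Least Adjustments method instead.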


Irrespective of the evaluation methodology chosen, comps must simultaneously be reviewed spatially, meaning that reviewing the precise location of the comps on the map is equally important. The reason is simple: despite meeting the distance criteria, certain comps might come from incompatible yet contiguous neighborhoods. For example, since our subject is away from the lake, a comp from the lakefront block (comp #7) would be inappropriate even though it meets the distance criteria. Therefore, the spatial review of the comps is critical.

Step-4: Analyze Final Value


The final valuation picture is generally depicted via a tabular form called the Comps Grid. It's a line-item comparative analysis of the subject vis-à-vis the final five comps that contribute to the subject value. It shows the neighborhood(s) they are drawn from, their respective distances from the subject, property characteristics with dollar adjustments, and the sales complex including time adjustments. All of this collectively translates to the subject value. The most probable value is usually the median of the adjusted sale prices of the five comps, while the most probable value range represents the statistical bound between the 25th and 75th percentile values. Of course, these parameters are specific to the Homequant system and may vary by application or target audience. For example, an alternative system geared toward short-term investors might expand the probable value range to the 5th and 95th percentiles, thus revealing potential entry and exit points.
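The most probable value and the 25th-75th percentile range described above can be computed directly; the ASP list below is hypothetical, and `statistics.quantiles` with the inclusive method interpolates the percentile cut points.

```python
import statistics

def value_and_range(asps, lo_pct=25, hi_pct=75):
    """Most probable value (median ASP) and a percentile value range."""
    qs = statistics.quantiles(asps, n=100, method="inclusive")  # 99 cut points
    return statistics.median(asps), (qs[lo_pct - 1], qs[hi_pct - 1])

asps = [508_000, 512_000, 515_000, 518_000, 520_000]
value, (low, high) = value_and_range(asps)
```

An investor-oriented system would simply call the same function with lo_pct=5 and hi_pct=95 for a wider band.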

I picked the above graphics from Homequant as I own and operate it, to avoid having to deal with any copyright issues. My Homequant site is totally self-directed (no modeled values), totally free (no strings), and requires no login or registration whatsoever. Please choose the self-directed site that works best for you.

First-time Homebuyers must Start Research at a "Top-down" Valuation Site


A Top-down home valuation site is one that allows users to work up the value of a "simulated" home without having to deal with a series of random comps. A good Top-down site generally offers the following features:

1. Sub-markets: All (socio-economically) prominent sub-markets within the market (say, Orlando) are generally supported, allowing users to toggle between sub-markets to evaluate and understand the variations in home values.

2. Home Type and Style: Home types (Detached, Attached, HOA, Townhouse, Condo, etc.) and styles (Ranch, Cape, Colonial, Conventional, Contemporary, Tudor, etc.) are important considerations for home-buyers so a good Top-down site incorporates them.

3. Location: A good school district tends to fetch a higher value than its counterparts with lesser known schools. Good sites therefore allow users to understand how such qualitative factors quantitatively contribute to the home value.

4. Land and Building Sizes: Users can educate themselves about how changes (increase/decrease) in sizes impact values within a given sub-market. Some sites allow users to further differentiate between total improved area and heated area, corner lot vs. non-corner, etc. Bath count is also an important consideration, as it helps indicate whether the home is size-optimized or a lifestyle purchase.

5. Building Age and Condition: Users can quickly learn how age and overall condition (including quality of rehab) impact values in a sub-market. Some sites might combine these two variables into one called effective age. Either way, these are important considerations in pre-owned homes.

6. View: A waterfront home could fetch significantly higher value than a non-waterfront one within the same sub-market. Similarly, a house with other enhancing views (park, bridge, skyline, golf course, etc.) could be pricier.

7. Amenities: Central A/C, In-ground Pool, Upgraded Porch, Tennis/Basketball Court, etc. often add value to homes so a good site would allow users to experiment with such options as well. 

Case Study

Our First-time Homebuyer = John Doe

John must be methodical in the research leading up to his home purchase. After a pre-qual of $300K, he has decided to focus on two Orlando-area sub-markets: Maitland and Winter Park.

He finds a Top-down site which allows him to perform his research without having to work up random comps. He realizes that while Winter Park has beautiful tree-lined streets, he gets more modern and slightly bigger homes in Maitland (a screened-in pool could be a bonus). He is very happy that the site allows him to evaluate numerous possible combinations of location, type, size, style, amenities and view. He also notices that the site applies a meaningful non-linear curve to values as home size increases.

I picked the above graphics from Homeyada as I own and operate it, to avoid having to deal with any copyright issues. My Homeyada site is mobile-friendly (no separate apps are needed), totally self-directed (no modeled values), totally free (no strings), and requires no login or registration whatsoever. It has a built-in non-linear value curve/scale tied to home size as well. Please use the site/system that works best for you.

Thursday, October 25, 2018

Formation of an Upward-Sloping Triangle Often Represents a New Bullish Market Trend

The Median Sale Price chart (top) shows meteoric growth in the first half of 2017, followed by a healthy sideways move within a tight range of $545K to $555K. Even the December drop, nothing spectacular, was quickly reversed, pointing to solid market fundamentals.

The fact that the 2-Month Moving Average trendline closely follows the data line indicates a fairly non-volatile price curve. Both trendlines nonetheless confirm the reversal, including the peak. Any breakout above the prior high of $557K would make the curve even more backward-bending (more bullish).

The normalized Price per SF trend (bottom chart) is so bullish that a linear trendline was in order (for demonstration purposes, of course). The formation of the upward-sloping triangle further adds to the bullishness.

In an event like this in the equity market, traders would initiate long positions. 

Again, the moving average trendline confirms the lack of month-to-month volatility, while also confirming the reversal and the impressive breakout.

- Sid Som, MBA, MIM
President, Homequant, Inc.
