RESO (RETS) 2014 Spring Meeting LiveBlog

The RETS family of real estate data standards is becoming increasingly important, and the process is moving at a fast pace. There were over 130 attendees at the spring meeting this year – twice as many as last year. The fall meeting dates are tentatively October 21-23, 2014. Following are some highlights from the meetings in Chicago.

RESO Spring 2014 Meeting

Kickoff & RESO Update (Rebecca Jensen / Robert Gottesman)

[Photo: Bob G. at RESO Spring 2014]

The goals of RESO are to facilitate plug-and-play software and seamless electronic transactions between systems, reduce costs, improve data accuracy, provide a framework for neutral collaboration, and foster innovation. The work today is done by very few people – mostly volunteers. What RESO needs more than anything is for people to participate – not just to sit in on the work group calls, but to participate and help create the standard.

In terms of funding, over half of RESO's efforts are funded by 58 MLSs, and there are a half dozen new MLS members this year. But there are also several new vendor members, including Delta Media Group, General Internet, ListTrac, N-Play, Retsly, Trulia, and Zillow. There are also three new broker members (Agent Ace, Barb Hassan Realty, and VIP Realty) – previously there was only one. There's also a new member class – class “E” – for other associations; the new member is The Canadian Real Estate Association. Billing is about to move to January 1st for budgeting purposes. Existing members will see a prorated bill on renewal.

Very soon there will be an official “ratification of standards” on all of the work that has happened over the past few years. The certification system is in beta and will soon be deployed. The ratification will start a one year clock on compliance (for NAR-related organizations) based on NAR rules.

The Member Intellectual Property Rights Agreement has been amended so that if a developed standard would infringe on a member's patent and the member doesn't speak up during a 60-day comment period, it is assumed that RESO et al can use the standard without infringing. [Note: this may not be exactly how an attorney would describe it and I'm not sure I've described it right. See the agreement for more detail - though it's not yet posted on the website. But soon.] There's also a trademark agreement and an anti-trust policy that have been implemented over the past year. Much of this will be emailed to members soon.

Many major milestones have been accomplished this year:

  • Data Dictionary: version 1.3 is ready for production
  • Transport Web API: version 1 is complete and implementation is underway
  • Research and Development: Property Unique ID Workgroup started
  • Certification & Compliance: RETS 1.7.2 / 1.8 ready for rollout
  • Member Collaboration System: all workgroups & committees converted

While there is ongoing work to develop and deliver meaningful standards, RESO will also be working to promote the adoption and deployment of RESO standards, encourage membership, and improve infrastructure (collaboration & the web site).

Marketing and Membership: New Initiatives for a New RESO (Tracy Weir / Ira Luntz)

The goals of marketing are to build visibility for key RESO initiatives, develop a community around RESO to spread the word, and rebrand RESO to make it easy to understand and access. The goals of membership are to maintain and increase RESO membership and build adoption of key initiatives: the data dictionary, RETS 1.8, and the API.

There’s going to be a new logo – more flexible than the old one, and much more modern. The logo will be on the website when rules for use are finalized and codified.

[Photo: the new RESO logo]

In 2014 RESO will improve the website, social networking (leveraging members' social footprint), and the blog. The blog will contain posts and content from work groups and the executive director. New content will be published at least bi-weekly and promoted via social media. The goal is to connect the website and blog with social media to drive traffic and engagement. Content will also be distributed to email subscribers.

Data Dictionary (Rob Larson)

[Photo: Rob Larson, Spring 2014]

The RESO Data Dictionary serves as the national standard for field names and lookups (enumerations). New versions of the data dictionary are about “growth”. There's some change, but it's not taken lightly. In 1.3, there are 135 changes: 42 field additions and 93 modifications across many areas of the dictionary: properties, member/office, contacts, media, saved search, open house, and enumerations. There were also some resource name adjustments to support schema. Commercial was added to the property resource, including three classes: commercial sale, commercial lease, and business opportunity. There are 387 commercial fields – but only 24 field additions were needed to support it overall. A DTD for RETS 1.x compatibility is coming soon.

See: http://www.reso.org/data-dictionary-1-3

In terms of what are considered “core fields”: these will be the fields required for basic certification. They will be required only if you have such data. The determination of what is core is in progress. Key listing fields are not so much about features (those vary from market to market?). There will be core fields for property, member, office, media, and open house. Certification will go beyond “core” with a graded system. The advantage of adopting as many fields as possible is that vendors will be able to plug in more easily and do more.

Dictionary version 1.4 will be introduced in the fall of 2014. The focus will be growth rather than change: enumerations, the listing resource (and property?), schedule and feedback resources (this is effectively the ‘prospect’ data, gluing the existing contact and saved search resources together), a queue resource (improving synchronization) and a “kill” (deleted items) resource, as well as rule and security resources. Rules would be a separate resource – split into less complex, describable rules (a resource, using JavaScript to describe?) and more complex functions (e.g. DOM) – which may be verbose descriptions rather than actual code or pseudo-code.
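
To picture the “describable rules” idea, here's a purely hypothetical sketch of what a machine-readable rule might look like, per the JavaScript direction floated in the session (nothing here is a ratified RESO format – the field names are from the Data Dictionary, the rule shape is my own illustration):

    // Hypothetical shape for a 'describable' rule -- a condition simple
    // enough for software (and humans) to read. Illustration only.
    const rule = {
      field: 'ClosePrice',
      condition: "StandardStatus === 'Closed' ? ClosePrice > 0 : true",
      message: 'ClosePrice is required when a listing is Closed.',
    };

    // ...while something like DOM (days on market) would stay a verbose
    // description, since computing it depends on local business logic.
    const complexRule = {
      field: 'DaysOnMarket',
      description: 'Days between list date and off-market date, per local rules.',
    };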

RESO 2014 Projects (Paul Desormeaux)

The Research & Development group gets input from the board of directors and work group members. Business-case justifications for solving specific problems are discussed. Ideas that pass muster may be assigned by the board to a work group for development.

Three items that are getting attention now relate to unique IDs: PUID (property unique ID), OID (organization unique ID – i.e. MLSs, boards & associations, brokers, vendors, title companies, appraisers, etc.), and IUID (individual unique ID – avoiding duplication and confusion as agents belong to multiple MLSs, move between MLSs, etc.). More on PUID later.

Should RESO be in the business of defining payloads? Payloads are sets of fields (e.g. an open house payload). The pro? Payload compliance would lead to quick, hassle-free setup of data feeds. The con? Agreement on payload contents is problematic. RESO won't define payloads – at least for now. If there is demand, RESO could set up a ‘framework’ for registered participants to define and share commonly useful payloads (similar to XML namespaces).
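
For illustration, a payload in this sense is just a named set of Data Dictionary fields. A registered, shared payload definition might look something like the hypothetical sketch below (no such format exists today; the namespace-style identifier and the structure are my own assumptions):

    // Hypothetical payload definition -- a named, versioned set of Data
    // Dictionary fields that a registered participant could publish and
    // others could reference. Nothing like this is standardized today.
    const openHousePayload = {
      name: 'urn:example:payloads:OpenHouseBasic', // namespace-style ID
      version: '1.0',
      fields: [
        'ListingKey', 'OpenHouseDate', 'OpenHouseStartTime',
        'OpenHouseEndTime', 'OpenHouseRemarks',
      ],
    };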

RESO wants to know what your pain points are – and it wants people to participate in the work group.

Certification & Compliance (Greg Lemon and Paula O’Brien)

Compliance is the implementation of a RETS standard – certification is a *demonstration* of your compliance. It's a subtle difference.

To become certified, you will need to submit an application to the application department. The compliance department will create ‘raw’ reports, then the certification department will create a report card. Finally, assuming the report card is positive, the application department will provide the applicant a certification award.

The certification tools will be available to download, so technologists can use the tools to see what would be in the report card before the compliance department is even contacted. Greg Lemon can provide links to the code on GitHub.

Certifications in progress include RETS 1.8 / 1.7.2 (beta), RETS Client (beta), and RETS Servers (beta).

Having a period of re-certification is under discussion but (according to Bob G.) re-certification for existing, already certified versions will probably not happen. This issue came up in the context of preventing vendors from certifying and then making big changes without re-certifying. The risk of this is being balanced against cost and effort of certification.

More information: http://certification.reso.org

RETS 1x (Paul Stusiak)

This workgroup supports and enhances the existing set of RETS 1 documents and the compliance efforts around them, and works on implementing the new data dictionary in RETS 1.x contexts. There is lots of work in progress, including a new change proposal (child records in the update transaction) which will be added to 1.8, updating the DTD documents for 1.8 including the data dictionary, and fixing the 1.8 document in the Confluence collaboration system.

The group needs more participation for reviewing documents and change proposals, understanding work group efforts that might impact your organization, and identifying and resolving inconsistencies with the RETS 1 standard.

RETS Web API & Security (Scott Petronis / Matt Cohen / Matt McGuire)

The RETS API will support a number of security mechanisms. Following are the recommended standards and their uses:

  1. HTTP Digest Authentication SHOULD be supported, as the easiest standard to implement, which addresses the first and most prevalent use case for RETS and which can be made to serve some other use cases as well.
  2. OAuth 2 SHOULD be supported as needed to support additional use cases, especially where three-legged authorization is required.
  3. SAML 2.0 Bearer Assertion Grant Type Profile for OAuth 2.0 – In environments where SAML is already in use, SAML MAY be used as an OAuth profile.
  4. SAML – In environments where SAML is already in use, SAML MAY be used.

Basic Authentication over SSL would have been a contender, but RETS SSL adoption has been low and there are complexities and expenses that might make it difficult – so Digest was the recommendation for simple authentication at the end of the process.
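
For the simple-authentication case, the flow is the classic RFC 2617 challenge/response. Here's a minimal Node.js sketch against a hypothetical endpoint – the URL and credentials are placeholders, and qop handling is omitted, so treat it as a shape, not a production client:

    // Minimal sketch of HTTP Digest auth (RFC 2617) against a
    // hypothetical RETS Web API endpoint. qop handling is omitted for
    // brevity; a real client should implement the RFC fully.
    const http = require('http');
    const crypto = require('crypto');

    const md5 = (s) => crypto.createHash('md5').update(s).digest('hex');

    function digestGet(url, user, pass, done) {
      // First request: expect a 401 carrying the WWW-Authenticate challenge.
      http.get(url, (res) => {
        res.resume(); // discard the 401 body
        const challenge = res.headers['www-authenticate'] || '';
        const param = (name) =>
          (challenge.match(new RegExp(name + '="([^"]+)"')) || [])[1];
        const realm = param('realm');
        const nonce = param('nonce');
        const { pathname } = new URL(url);

        // response = MD5(MD5(user:realm:pass) : nonce : MD5(method:uri))
        const ha1 = md5(user + ':' + realm + ':' + pass);
        const ha2 = md5('GET:' + pathname);
        const response = md5(ha1 + ':' + nonce + ':' + ha2);

        const auth = 'Digest username="' + user + '", realm="' + realm +
          '", nonce="' + nonce + '", uri="' + pathname +
          '", response="' + response + '"';

        // Second request: same URI, now with the Authorization header.
        http.get(url, { headers: { Authorization: auth } }, (res2) => {
          let body = '';
          res2.on('data', (c) => (body += c));
          res2.on('end', () => done(null, res2.statusCode, body));
        });
      });
    }

    // Placeholder URL and credentials -- substitute your server's values.
    digestGet('http://rets.example.com/api/Property', 'user', 'secret',
      (err, status, body) => console.log(status, body));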

The new transport will be a “RESTful” way of delivering real estate data. The goal was to facilitate web, mobile and social applications in a lightweight manner – and not to re-invent the wheel. This will make it easier for new developers and MLSs to implement. It will use existing standards – W3C, OData, OGC, and OAuth – and existing technologies such as HTTP, XML, and JSON.

Currently the standard document is complete. It was published and received public comment from November 2013 through February 2014. Revisions were made based on comments and questions, and version 1.01 addresses those issues. There was some minor clean-up (XSD and JSON updates) based on feedback as members prototyped the new standard. The document is in “lockdown” now with very minor revisions in progress – the goal is for the board to approve and ratify the document soon, so folks can implement.

The initial release is focused on metadata representation, read access (standard search), geospatial search, and hypermedia representation. It uses OData v3 and OAuth2. Out of scope: create, update, and delete functionality; a data replication framework; requesting binary media; updating binary media; saved searches and resources.
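
To make that concrete, here are a few illustrative requests against a hypothetical server root (https://rets.example.com/api). The resource and field names follow the Data Dictionary, but the URL layout and the geography-typed field in the last example are assumptions to be checked against the Web API document:

    # Discover resources, fields, and types
    GET https://rets.example.com/api/$metadata

    # Standard search: active listings under $400,000, newest first
    GET https://rets.example.com/api/Property?$filter=StandardStatus eq 'Active' and ListPrice lt 400000&$orderby=ModificationTimestamp desc&$top=25

    # Fetch only the fields you need
    GET https://rets.example.com/api/Property?$select=ListingKey,ListPrice,City&$top=10

    # Geospatial search (OData v3 geo functions; distance in meters)
    GET https://rets.example.com/api/Property?$filter=geo.distance(Location, geography'POINT(-87.63 41.88)') lt 1609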

There are already three prototype implementations: UtahRealestate.com, MLSListings, and NAR/CRT’s. These will be demonstrated tomorrow.

What’s next for the group:

  • Create, Update, Delete
  • Binary media
  • Data replication
  • Saved searches, listings, etc.

Property Unique Identifier (Kristen Carr)

[Photo: the all-powerful Kristen Carr]

The goal is to create and assign a unique identifier for real property (land and things that are attached to the land) in the U.S. and Canada. This would enable all parties involved in the transaction (not just the listing transaction) to follow the history of a property. Real property does NOT include vacation rentals, business opportunities, or mobile homes that are not attached to the land. The effort will focus on the U.S. and Canada for now, but should allow for future growth outside these countries.

If the property has an APN (assessor's parcel number), it fits into the algorithm for a PUID. Possible methods involve creating a database for assigning and storing PUIDs; using APN numbers; or an algorithm that ‘meets in the middle’: (ISO 3166) + APN + Unit = PUID. FIPS was considered, but even parts of the U.S. government are moving away from FIPS toward ISO 3166. A PUID will only change if the base property changes, i.e. rezoning / subdividing. It would not change if something was rebuilt.
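
Here's a tiny sketch of that ‘meet in the middle’ composition. Everything about it – the APN normalization, the separator, the output shape – is my own illustration rather than anything the work group has settled:

    // Sketch of the 'meet in the middle' idea:
    // (ISO 3166 country) + APN + Unit = PUID. Purely illustrative.
    function makePuid(countryCode, apn, unit) {
      // APN formats vary wildly by county, so normalize to a stable key
      // (this normalization is an assumption, not part of any standard).
      const cleanApn = String(apn).replace(/[^0-9A-Za-z]/g, '').toUpperCase();
      return [countryCode, cleanApn, unit].filter(Boolean).join('-');
    }

    console.log(makePuid('US', '123-456-789', '2B')); // -> US-123456789-2B
    console.log(makePuid('CA', '014 213 554'));       // -> CA-014213554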

Obstacles might be:

  • Unstructured APNs
  • Unavailable APNs
  • Lack of adoption by other standards bodies
  • Difficult to implement
  • Low motivation

Rolling Out Data Dictionary Round Table

Rob (CRMLS): There are lots of ways to implement the dictionary. You can change your metadata on your existing database, you can transform your database to match the new standard – or some option in between.

What’s the biggest issue with rollout? [Matt says: what follows is my attempt to summarize - not quotes]

Olga (MLSListings): The major issues are when we need to make schema changes to be 100% data dictionary compliant – like when one field in our system maps to multiple fields in the dictionary. That's hard when you have existing integrations.

Rick (Metrolist Services): 70% of our data mapped fine. We had ten fields with enumerations where we needed the help of the vendor. We'll have to do an ETL on the fly.

Richard (Midwest Real Estate Data): Going from multi-selects to single-selects is hard. We may request changes to the data dictionary.

Rick: That’s the best part of being on the work group – getting changes made.

Richard: It does take time, and you’ll have to get your vendor involved. Our vendor already has 50 hours into 2 property types and 100 hours left to do. It’s not a slight undertaking but it’ll be worth it down the line.

What do you think of core fields?

Olga: It makes it more realistic – it’s good to break things into stages. To be 100% compliant – I don’t see an easy way other than ETL. It’s too many structural changes otherwise. Too many ties to legacy data.

Rick: Core fields are the way to go. Phases are do-able. 100-120 fields will take care of a lot of the use cases out there. 100% is attainable too, but I don't know how far off that is.

Richard: We’re in the same boat. Core within the year – no problem. We’ll see what the rest takes working with our vendor. In terms of approach, ETL and metadata changes are both a possibility.

Rob: I want to change my metadata – I don’t want to transform forever. My advice is to figure out how you’re going to sell it.

Rolling Out the Web API

Scott (Onboard): We're in a different boat – in the infancy of the Web API. I'd like to talk about roll-out in earnest, but there are steps needed to get us there. There are technical challenges to roll-out, and items beyond the technology that we should talk about. I'll bring up some of those as we walk through this.

From the technology perspective, where did you start – what did the investigation and learning to implement look like?

Fred (UtahRealEstate.com): I played with Node.js and found a library that had an OData server – I thought I was 90% done. So I thought. I started reading the spec – metadata discovery – that wasn't hard. I thought I'd throw Digest on this and be done, and started reading through the auth spec. Then I realized I had to generate the authentication credentials. Mark Lesswing spun out a separate server that did that, but I spun out my own security provider. Then I still had some metadata tweaks. I'd say we spent as much time mapping our data to the dictionary as we did on the new API. It still doesn't have logging, rate controls, etc.

Ashish (MLSListings): We are an MLS, but we're also a software company with a software team. Because we're all Microsoft, .NET, and C#, OData was great for us. The question I had was whether to build the client or the server first. We implemented something we can provide to our brokers, because it's hard for them to build a RETS client. Long term we'll think about vendors and start converting our RETS server to use OData.

Matt McGuire (CoreLogic): I spent a lot of time prototyping. Because we work with .NET and C#, it's a good fit – it does most of the work for us. I don't think people want to have to research and buy a RETS client. The selling point is that the service is there, the libraries for talking to it in real time are there, and browser applications can be built that way. Long term, with that kind of ease of access, it won't be as complex.

Scott: It won’t be overnight. You don’t turn a ship in a second. Beyond the technology and build, have you been contemplating what this means to the business?

Ashish: I don’t see it as alternate but as ultimate. The industry is changing – there are so many mobile devices. RETS (today) is bulky and difficult to understand. The future has to be oData – lightweight and fast.

Scott: Does it make sense to do things in step form or is there an opportunity to roll things out together – i.e. data dictionary and the web API? Or is that too much to bite off?

Matt: These things are separate but work together. They have to be maintained separately but work together. The groups must work together. We can’t come out and say this version of this, that version of that, etc. are a version of the standard. Because they change at different paces.

Ashish: We have to think backward compatibility. It won’t be a full switchover at any time unless you know all the legacy applications are ready.

Fred: We’ve been excited about this. We’re full steam on this. The quicker we get it out …

Scott: There will be a variety of approaches – full steam, toe in the water, and step back and wait. The technology is one piece. What about market education – communicating to the vendor community that it’s coming and providing education?

Matt: To some degree internally we understand what we need to do – between our business units. In terms of MLSs and brokers – no – I don’t know how I would couch that.

Ashish: Education – that was the first thing I thought about. That's why we started ‘broker first’ so we could show them the power. Starting in June we'll go out to the brokers and that's a process we'll continue and carry on.

Fred: Our membership is involved. Our BoD is excited and are up to speed. There are quite a few vendors we’ve talked about it to – they want us to roll it out. They’re ready and waiting for it to get done.

Scott: From a RESO perspective, what could we do regarding the education portion of this?

Matt: Technology knowledge-base. We’re building it without even thinking about it. Not just the spec but white papers.

Scott: We provide a lot of use cases in documentation – helping people better understand the “why”.

Ashish: Look at the market segments. There are technical people and business people. If you provide one knowledge base who are you targeting? You have to have materials for the technology people and examples for the business people.

Fred: People are going to wonder what this is going to cost. It’s three lines of code to consume – it can be running in a number of hours. The brokers’ lights go on.

Matt says: sorry folks, my hands couldn’t keep up with the lively Q&A :-)

Updating Listings with RETS (Pace Davis)

Update is starting to get more attention. There is nothing to be afraid of – update is not new! It has been around for 8-10 years, since RETS 1.5. It has been supported by CoreLogic, Black Knight, and MRIS. CRMLS has now set up update, and we have other clients doing it as well. Search is only half the system – update is the other half. Update increases the number of applications you can use. Brokers are starting to demand it; they are tired of error-prone duplicate entry. It's great that some vendors already support it, but we need more people to open up to it. It's taken a long time to gain traction, for both business and technical reasons.

On the business side, the business rules are complicated to create, so add/edit was considered too hard to build. That was a valid reason, but it's a bit of misinformation – sometimes vendors just don't want to take it on. They don't want to open the door to selling another front-end product. All systems in this industry come down to mapping and business rules. The RETS Update project centralizes your business rules – once you have the rules defined, they can be read by other systems as well as by humans. This is much more transparent, flexible, and efficient than legacy systems.

On the tech side, an update server adds an edit mask (number and type of characters per field), update help (descriptive text), validation expressions (BNF notation), and presentation (if you offer that – groups of fields, columns, etc.). Strictly speaking, you don't need any of this, though – you can send in an update transaction and read the documentation for what to do; you'll just get back errors and modify your code.
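
For a feel of the mechanics, here's a schematic Node.js sketch of an Update request. The argument names follow the RETS Update transaction, but the URL, field names, Validate value, and delimiter encoding shown are assumptions – consult the RETS 1.7.2/1.8 spec and your server's metadata before relying on any of it:

    // Schematic RETS 1.x Update request. Argument names follow the RETS
    // Update transaction; everything else here is illustrative.
    const http = require('http');
    const querystring = require('querystring');

    const DELIM = '\t';
    const record = ['ListingKey=ABC123', 'ListPrice=450000'].join(DELIM);

    const body = querystring.stringify({
      Resource: 'Property',
      ClassName: 'ResidentialProperty',
      Validate: '1',   // validate-only vs. commit: see the spec
      Delimiter: '09', // delimiter used in Record (hex -- an assumption)
      Record: record,
    });

    const req = http.request({
      host: 'rets.example.com',
      path: '/rets/update', // server-specific Update URL from the login response
      method: 'POST',
      headers: {
        'Content-Type': 'application/x-www-form-urlencoded',
        'Content-Length': Buffer.byteLength(body),
        // ...plus RETS-Version and your session's auth headers
      },
    }, (res) => {
      let xml = '';
      res.on('data', (c) => (xml += c));
      // The reply is a RETS XML body; validation errors come back per field.
      res.on('end', () => console.log(res.statusCode, xml));
    });
    req.end(body);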

Adapting to an API (Mark Lesswing)

Mark demonstrated open source software created by CRT:

  • RETS Web API Server (built in Node.js and MongoDB to front-end a RETS 1.x server!)
  • RETS Web API Reverse Proxy (built in Node.js – adds 2-5 ms. Digest is done, OAuth2 is coming soon)
  • OAuth2 “Authorization Code” Plug-in
  • OAuth2 Server (not released quite yet, written in Node.js)
  • Jaydata – Fork for RETS OData model
  • RETS Data Dictionary as a module

All the work is released under an MIT License.

Demonstrations (Paula O’Brien & Greg Lemon, Fred Larson, Ashish Antal & Olga Ermolin)

Paula and Greg demonstrated the compliance tool, which is available for download.

Fred demonstrated his RETS API server.

Ashish and Olga demonstrated a wonderful implementation of OData usage, pulling data directly into Excel, where it was manipulated to provide broker-level business intelligence, including some great-looking charts. This was a great example of how the new technology opens up all sorts of new opportunities.

RETS Web (and other) API Security (Matt Cohen)

I outlined the risks to API credentials – especially when used on mobile devices. Apps on mobile devices can be trivially decompiled and their traffic sniffed for secrets. Apps that use app-level credentials instead of end-user credentials are especially at risk. Several ideas for security best practices were presented – but nothing really ready for public consumption. Sorry, readers! My goal is to create a best-practice document that data stewards (MLSs, brokers, etc.) could use when contracting for and/or auditing security in a mobile app. Feedback was requested from the group toward that end.

Closing Remarks (Robert Gottesman)

Bob asked whether we should have a plug fest and, if so, what the format should be. There seemed to be support for the concept, but no decision was made on the spot. This will be taken up by the technical committee.

Finally, he reiterated the mission of RESO: to improve data integrity and quality for every part of the real estate transaction.


That’s all folks!

7 comments on “RESO (RETS) 2014 Spring Meeting LiveBlog”
  1. Gregg Larson says:

    Nice update, Matt!

  2. Imraan Ali says:

    Matt, please let us know how we at NuOffer can help.
    Feel free to email me.

    • Matt Cohen says:

      Imraan – you are welcome to join RESO (reso.org), show up to RETS meetings to learn more, and join work groups to participate toward improving the standard.

      All are welcome!!!

  3. Al McElmon says:

    Love the live blog idea!

  4. Thanks for your summary, Matt. Hope to attend and participate in the October 21-23 meeting in Chicago. Kevin

  5. Hi Matt, Thanks for the notes and sharing them with the community.

    The RESO Server Compliance Tester is available on GitHub at this link: https://github.com/RESO-RETS/RESO-Server-Compliance-Tester

    Any comments or questions may be submitted through the certification website: http://certification.reso.org/

    Thanks, Greg