Introduction … And a New Leader!
Rebecca Jensen, RESO’s board chair, started off the day by announcing new directors: Rich Lull, Russ Bergeron, Ira Luntz, and Jeremy Crawford. There were some re-elections as well: Steve Byrd, Rob Overman, and Jensen herself. The broker position is currently unfilled, and RESO is hoping a broker steps forward. Also, Travis is no longer leading RESO. The board went through a search process, reviewing over 140 applicants, and Robert (“Bob”) Gottesman was selected as RESO’s new Executive Director.
Bob took the microphone to lay out his role and responsibilities: working with RESO to develop and implement policies, procedures, and short- and long-range strategic plans; administering finances; managing committees and work groups; finding and managing staff to assist work groups; and maintaining close working relationships with stakeholders. He is primarily following the lead coming from the board. His primary goal is to design, construct, and deliver real estate standards as dictated by RESO leadership, to promote adoption of RESO Standards, and to grow RESO membership. Secondarily (supporting the primary goals), he will be defining value propositions, creating compelling marketing and branding strategies, implementing a compliance program tied to branding, and understanding and tackling migration issues. Thirdly, he wants to create a sense of urgency. Bob intends to keep committees and work groups motivated and productive – providing well-defined goals and timelines, ensuring accountability, and closely monitoring progress with metrics. [Matt says: Hooray!!!] He intends to improve two-way communications – reaching out to members, future members, and influencers, keeping RESO transparent, and using social media and rets.org to transmit and gather information. He’ll also provide report cards every month – a continuous assessment – trying to figure out how to help a group succeed when it falls behind, and allowing realignment when necessary. Bob has a good project management track record, he understands the mechanics of volunteer groups, he will strive for consensus, and he is not shy about expressing an opinion but will bend to the majority. [Matt says: Great choice, RESO! Good luck to Bob!]
During the Q&A, I suggested that Bob swiftly move to set “ship dates” for key initiatives and let stakeholders – especially RESO members – understand what is coming and when, putting these initiatives not just in terms of technology but business benefits.
State of the Standard
Bob presented the “State of the Standard”. We’re zeroing in on a “plug and play” environment – payload, transport, interoperability, UUIDs for easy identification and universal data fingerprinting, and tie-ins to event models.
Here’s where we are now:
- Data Dictionary: 1.1 ready for release.
- Transport: recommendation on transport foundation later today.
- Syndication: merge with data dictionary effort
- R&D: cleaning up backlog / focus on high value projects
- Compliance: Compliance and certification on RETS 1.8 in the works
- Authentication and Authorization: Chairperson acquired; seeking members.
Mark Wise presented on “Measurement of Success”. How will we report back to the stakeholders funding us? Take the data dictionary as an example: Rob Larson published 1.0, and now 1.1 will be published. Is that success? No, it needs to be implemented. Is that success? No, it needs to be used. Maybe we set a goal – 50% of RETS servers using it correctly within two years. Or something more aggressive.
Mark Lesswing presented on the “Future of the Standard”. Right now we have RETS 1.x – a good bulk-transfer standard; a syndication standard; and a data dictionary, soon to be version 1.1. Next steps: make RETS 1.x data dictionary compatible and create a compliance program; convert syndication to a payload; and keep the data dictionary moving forward. Long term: solve “real world” problems, move quickly, make the standard easy to use, and implement it in a standard way. If not, RETS will lose relevance (to others). We have the data dictionary; we must move quickly on transport (creating a “real world” lightweight transport: quick, small accesses). The event catalog – the key to plug and play – is next: what are the things people do that require MLS information? Example: Event Name (getHotsheet), Variables (Date – default to today if null), Response (from the data dictionary). Other examples: UpdateListing, GetHotsheet, SearchUsingLocation, PropertyDetail, PropertySummary, RequestShowing, MakeOffer. Fast and easy – that’s the future of the standard effort.
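The event-catalog idea is easiest to see as data. Here’s a minimal sketch of what a getHotsheet catalog entry and its default handling could look like – the structure and names are my own illustration, not anything the workgroup has published:

```python
from datetime import date

# Hypothetical event-catalog entry for "getHotsheet" (illustrative only).
# An entry names the event, its variables (with defaults), and the
# response payload drawn from the data dictionary.
GET_HOTSHEET = {
    "event": "getHotsheet",
    "variables": {
        # Per the example above: Date defaults to today if null.
        "Date": {"type": "date", "default": lambda: date.today()},
    },
    "response": "DataDictionary:Listing",  # placeholder payload name
}

def resolve_variables(event, supplied):
    """Fill in defaults for any variables the caller omitted or passed as None."""
    resolved = {}
    for name, spec in event["variables"].items():
        value = supplied.get(name)
        if value is None:
            default = spec["default"]
            value = default() if callable(default) else default
        resolved[name] = value
    return resolved
```

So `resolve_variables(GET_HOTSHEET, {})` fills in today’s date, which is the “default to today if null” behavior described above.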
Q&A: Compliance testing for 1.8 and new initiatives will be created – older standards will not have a compliance tool. The board and Bob also committed to providing “ship dates” for some of the bigger components … soon!
Chip McAvoy then talked about Compliance. The challenge is that there isn’t an official way to implement the standard (now that the old compliance tool is no more). RETS implementations need to modify the MLS system schema to match the new dictionary, modify a product database to match the dictionary, and pull dictionary-compliant data from an existing RETS server. To implement the dictionary via 1.8, it needs to be retrofitted in: CoreLogic created an RCP for the RETS 1.8 Metadata DTD, created an RCP for an alternative standard name, and created a proposed schema payload for the data dictionary 1.x. Both RCPs were approved by the board on 5/21/2013. CoreLogic has already implemented this in their RETS Pro server and RETS Connector client – and also implemented this against their AggDB (aggregated database). Chip and Matt McGuire then explained technically, and demonstrated, how to implement the data dictionary in 1.8. It’s as easy in their connector as choosing what payload you get back and in what format you want it (including CSV) 🙂 Technically, they used the GetPayloadList transaction to list alternate payloads supported by the server. Then one uses GetMetadata to get the metadata for a specified payload. Then one uses Search with a search parameter extension: StandardNames (i.e. StandardNames=DataDictionary:1.0). There’s also a Search extension: Format. And another extension: Payload. Full documentation, of course, is on RESO.org. There was a little kerfuffle about the process for how these RCPs made it through without a specific person’s oversight. [Matt says: But I saw this go through the R&D Workgroup, the technical committee, and the board. Perhaps it could have gotten more community feedback – but I like this “work lean, launch fast, present to community, and tweak” approach – it ensures that we move at an appropriate speed without getting hung up the way we have in the past.]
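To make that three-step flow concrete (GetPayloadList, then GetMetadata, then Search with the StandardNames and Format extensions), here’s a rough sketch that only builds the request URLs – the host is a placeholder, and this is my own illustration rather than an official client:

```python
from urllib.parse import urlencode

BASE = "https://rets.example.com/rets"  # placeholder server, not a real endpoint

def get_payload_list_url():
    # Step 1: list the alternate payloads the server supports.
    return f"{BASE}/GetPayloadList"

def get_metadata_url(payload):
    # Step 2: fetch the metadata for one named payload.
    return f"{BASE}/GetMetadata?" + urlencode({"Payload": payload})

def search_url(resource, query, standard_names="DataDictionary:1.0", fmt="COMPACT-DECODED"):
    # Step 3: Search, using the StandardNames extension to request
    # dictionary-standard names, plus the Format extension.
    params = {
        "SearchType": resource,
        "Query": query,
        "StandardNames": standard_names,
        "Format": fmt,
    }
    return f"{BASE}/Search?" + urlencode(params)
```

The point is just the shape of the sequence: discover payloads, pull the metadata for one, then search against it with dictionary names.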
Chip then went into the compliance plan – it’s been sitting on the shelf for three years. The compliance checker has been on hold while the new stuff was worked out – but that’s what all the current business is focused on. So, we need to re-engage the plan. What’s being discussed now: the Hitachi tool is hard to keep updated. Does it make sense to build out that tool for what we have in front of us today? Maybe not. An alternative is to not automate compliance testing with a tool – but rather to set up a common test suite – a regression suite – and hire a third party to run the tests. That would allow us to test for compliance in 2013. That will be discussed by the compliance workgroup. There would still be reference implementations to test against.
Research & Development Workgroup
Paul Hethmon then got up to discuss R&D and invited Matt McGuire to talk over an RCP for Geospatial Search – something already reviewed by the R&D Workgroup. Many, many highly detailed scenarios with regard to that RCP were discussed. Then Paul put up the big spreadsheet of all of the items before R&D at the moment, and we reviewed many, many items – including Activity, Push Transport, the Open House payload, etc. [Matt says: I think people didn’t need to see this kind of sausage making – there are just too many ideas being thrown against the wall for evaluation to get real feedback in this forum. If folks want depth and want to contribute to the business case evaluation process, they probably need to join the workgroup. In the future I think it would be more productive to demonstrate the process we use for prioritizing items and let people know what has made it through and what is actively in process. Just my two cents. The “Activity” payload was almost killed off during this session, until I explained that it was intended as the glue between Contacts and Saved Search that forms what we refer to as “Prospects”. Then everyone got it…]
Paul Stusiak gave an update on syndication standards. RESO has been asked to align the syndication schema (for which there are 120 consumers) and came to the conclusion that it doesn’t make sense to substantially modify what’s out there and working. So the plan is to wait for data dictionary adoption and then rebuild the syndication schema based on the data dictionary, perhaps later this year.
The Data Dictionary
Rob Larson gave an update on the Data Dictionary. There’s a lot of work going on now to improve the Data Dictionary – additional enumerations, eliminating redundancy, managing breadth, and improving extensibility.
The dictionary workgroup has a change request procedure – please submit a name, definition, type, length, and justification to the group. So far, improvements over the past few months for Data Dictionary version 1.1 have included resources for: member, office, media, history, and contacts. There have been 18 change requests approved and rolled into the dictionary – and there are more to review. The plan for Data Dictionary version 1.2 (to be complete by September 2013) includes open house, commercial, saved search, enumerations (the possible field values in many areas), and any other add/change requests. There are lots of details to work through – is “Caravan” or “Association Tour” a separate class from Open House? Feedback is that Showings are certainly in a different class. Metadata submissions are due by June 15th. On the commercial front, the question is whether RETS should provide a full Commercial payload, or a “Resimercial” one? RESO has been coordinating so far with OSCRE. Greg (RMLS – Portland, OR) suggested NOT dumbing down Commercial standards. Rob said it’s not dumbing down so much as providing a standard for what we actually seem to be using in our MLSs, but doing so in alignment with the OSCRE standard. Mark Lesswing suggests we have both. I say we should do that – Resimercial is a subset of Commercial just like Syndication is a subset of the Data Dictionary. Metadata submissions for Commercial are due on June 15th. Saved Search: we’re essentially looking at the identity of who created it, an identifier (the saved search name), some tie to a client or activity, and some way of representing criteria. Matt: there may be a need for “when changed” and those types of management fields. Should the criteria be in DMQL? Matt McGuire says there’s no way around that. What about groups of saved searches? What kind of content is being requested – Data Dictionary? This will take some community contribution. What is a saved search? Is it the criteria?
Or just an ID that goes back to the RETS server to get the results? That wouldn’t support all the use cases (i.e. MLS migration). So, both methods could be explored. Metadata submissions for Saved Search are due June 15th. Regarding enumerations, we need submissions! Please look at the spreadsheet and contribute – but please limit submissions to your more heavily used items, and put good definitions around your enumerations. Go to the Google group (groups.google.com/d/forum/reso-data-dictionary). Email firstname.lastname@example.org. There are monthly calls for the data dictionary. We are thinking of a July meeting in Los Angeles at CRMLS.
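The “criteria vs. ID” split can be sketched quickly. Both records below are purely illustrative – none of these field names are in the dictionary – but they show why only the criteria-carrying form survives an MLS migration:

```python
# Illustrative only: two possible shapes for a saved-search record.
saved_search_by_criteria = {
    "OwnerMemberID": "agent123",
    "SavedSearchName": "Downtown condos",
    "ModificationTimestamp": "2013-05-21T10:00:00Z",  # the "when changed" field
    "Criteria": "(PropertyType=|CONDO),(City=Portland)",  # e.g. DMQL
}

saved_search_by_id = {
    "OwnerMemberID": "agent123",
    "SavedSearchName": "Downtown condos",
    "SavedSearchID": "ss-0042",  # only resolvable on the originating server
}

def is_portable(search):
    """A record that carries its own criteria can move to a new MLS;
    an ID-only record depends on the original RETS server existing."""
    return "Criteria" in search
```

Which is exactly why both methods may need exploring: the ID form is lighter, but the criteria form is the one a migration can carry along.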
Scott Petronis, Clark Endrizzi and Matt McGuire reviewed the RESO Transport Workgroup progress. What we want to accomplish: a “modern” transport standard – something based on RETS, lightweight, that reduces the amount of data replication, that better controls the flow of data, that provides better access options for the “common” developer, and that does not require a massive effort for all vendors involved. This is being done to be able to more easily access data rather than replicate data (RETS 1.8’s strength) – something good for environments like mobile. A year ago, what was being considered: OData – “Let’s not re-invent the wheel, let’s adopt something with legs.” Since then there was a lot of discussion. Other options emerged and existing APIs departed from OData.
The group is still proposing OData for transport. What is OData? OData is an application level protocol for interacting with data via RESTful web services. It’s designed for the problem area we’re focused on. Initially it was very focused on ATOM/XML and was originally driven by Microsoft. Now it has been taken up by OASIS [Matt says: just like SAML for SSO!].
There’s still debate and we need to get to agreement:
Mike Wurzer opined that OData wouldn’t cover all the cases RETS requires and extensions are not that easy. Matt Lavallee responded: “OData Version 3 is a radical expansion that covers many use cases including geospatial search – and now version 4 is coming out and certainly encompasses what we need for RETS and much more.”
- Already a working standard
- OData Filter and Query support is comprehensive
- Encapsulates search and syndication in one
- No one has to write their own query parser
- Supports XML (ATOM) and JSON
- Reference implementation exists
- Validation/compliance tools available
- Data model agnostic
- Support for .NET, Java, PHP, Ruby, Oracle, SQL, MySQL, nodeJS, etc.
- Build for what our industry does
- Primary driver was Microsoft (not an issue anymore)
- Some key limitations, namely geographic search (still an issue in v3?)
- JSON support still maturing
- Some departures from REST principles (like most standards)
- Other, competing standards like GData (GData may be on the way out)
- Any choice has pros and cons
- Every approach taken to date makes compromises on REST principles
- No matter what approach, the community will have work to do
- OData is not perfect, but it’s a solid foundation we can work with
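For a flavor of what this looks like in practice, here’s a small sketch of an OData-style query URL using the standard $filter/$select/$top system query options. The service root, entity name, and field names are all hypothetical:

```python
from urllib.parse import urlencode

SERVICE_ROOT = "https://api.example.com/reso/odata"  # hypothetical service root

def odata_query(entity, flt=None, select=None, top=None):
    """Build an OData query URL from the standard system query options."""
    params = {}
    if flt:
        params["$filter"] = flt
    if select:
        params["$select"] = ",".join(select)
    if top is not None:
        params["$top"] = str(top)
    qs = urlencode(params)
    return f"{SERVICE_ROOT}/{entity}" + (f"?{qs}" if qs else "")

# A hypothetical "active listings under $500k" query:
url = odata_query(
    "Property",
    flt="StandardStatus eq 'Active' and ListPrice lt 500000",
    select=["ListingId", "ListPrice", "City"],
    top=25,
)
```

Note the “no one writes their own query parser” point above: the filter expression language comes with the standard, so a server only has to evaluate it.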
The scope of what Transport wants to accomplish is:
- Query (Search)
- Not Create, Update, Delete to start
- Focus on key resources
- Extending to other resources is a “roadmapping” exercise.
- Transport will standardize how to search by individual data types (how to search by string, number, list, lookups, etc.)
- Pagination is defined in OData, use that functionality
- URI will have ../reso/complianceVersion/property/
- Media accessed via hyperlink
- Endpoint defines the payload (example)
- Can also do select statement to limit to specific fields
- Use functions to implement views that you want
- This helps with things like
- implementing payloads
- implementing saved searches
- implementing saved select statements (like “short cuts”)
- Geographic search (point plus radius, polygon functions)
- Select and sub-select (get statements – various payloads like mobile)
- Lookups (can be handled by “collection elements”)
- Missing some “convenience” factors (using any and all solves this)
- Compliance – need to dig into OData compliance tools.
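Two of the scope items above – pagination and point-plus-radius search – can be sketched as URLs. $top/$skip are standard OData options, and geo.distance is OData’s geospatial filter function; the service root, entity, and Location field are my own placeholders, and distance units depend on the server’s coordinate reference system:

```python
from urllib.parse import quote

ROOT = "https://api.example.com/reso/odata"  # hypothetical service root

def page_url(entity, page, page_size=100):
    # OData pagination: $skip past earlier pages, $top for the page size.
    return f"{ROOT}/{entity}?$top={page_size}&$skip={page * page_size}"

def radius_url(entity, lat, lon, radius):
    # Point-plus-radius via geo.distance against a geography point
    # literal (WKT order is longitude then latitude).
    point = f"geography'POINT({lon} {lat})'"
    flt = f"geo.distance(Location, {point}) lt {radius}"
    return f"{ROOT}/{entity}?$filter=" + quote(flt)
```

Polygon search would follow the same pattern with geo.intersects and a POLYGON literal, which is part of why the group flagged geospatial support as a question to verify against the OData version chosen.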
The group is still looking into solutions for how to handle each challenge and has some ideas for each. Next steps after that would be to look into additional resources: tax, history, contacts, events, statistics, etc.; Additional use cases such as syndication and replication; Additional geographic search types (lines, buffers, multi-polygon, etc.); Getting a connection with the OData team to help promote our agenda (but we can’t count on this).
OData is a starting point. It won’t be a panacea. But we have more (non-industry) standard tools to use as a starting point with OData than without it. It can be put into place alongside RETS 1.8, providing support for key features for use cases the older transport doesn’t support easily. Then, we can step into the more complex uses.
It was requested that, if there is consensus on this, the workgroup document the use cases and provide a technical document describing how OData would work.
There was a call for a show of hands regarding whether we, as a group, had enough to decide to go forward with OData – writing it up and discussing from there – or to go back and look at alternatives. This would not mean that OData is confirmed as the new transport.
Michael Wurzer reiterated that he wanted his Spark API adopted.
Rob Overman suggested that we look at proof of concepts for each of the options before making a decision.
Paul Stusiak suggested the group show reasons why OData was better than Spark or MRIS’s API.
A show of hands was made – the Transport group will continue exploring OData and also provide insight to the community regarding its selection and other options, if any.
[Matt says: I think it’s important to note that choosing a non-proprietary and non-real-estate basis for transport was a key selection criterion. The idea behind that is that there are already enough terms and fields for developers coming into our industry to learn, and picking a transport developers may already know – one with a lot of support and existing tools, which they may even have learned to use in school – would lower the barrier to entry for new innovators. That and other criteria really made it a choice between OData and GData, and GData seems to be on the way out. My opinion is that we should trust the Transport group and their months of in-depth work on this. That said, like Paul, I’d like to see some documentation of the reasoning – how they got to this point. “Trust but verify,” right?]
Since we didn’t end up having breakout sessions on day one, our morning plan to recap those breakouts wasn’t needed, so it was decided to have the workgroups summarize yesterday’s progress and take motions if needed. Then Bob’s plan is to talk about what we hope to get done in the next six months.
Compliance: RESO is soon going to put a compliance process in place for RETS 1.8 (1.8 ONLY – not earlier versions). [Matt says: it’s time for MLSs to transition to 1.8 to stay in compliance with NAR policies re: RETS compliance.] RESO is going to give you tools to get a certification – but it won’t be automated like the Hitachi tool. Later there will be a badge you can put on your website. Certifying compliance with the data dictionary will come later. And, once the new API is published, there will be compliance for that as well. Ed Newman offered to give Bob a requirements document previously developed for a compliance checker – this had been used to solicit sub-$100k bids to build a version-independent compliance checker (never built). We discussed what could be fully automated (server functionality) and what could not be automated, or could only be partially automated (dictionary compliance checks are much trickier). Bob committed to providing a status update to the community in June.
R&D: The group will create a survey for the community to prioritize items on R&D’s plate and get that out soon. The R&D Workgroup has been designated the gatekeeper for changes – so if you have things you want to get done, join the workgroup or submit items to Paul Hethmon. R&D is a sieve of ideas – giving ideas from one organization wider exposure to see if there’s wider support. Items that pass through R&D will go to the technical committee, then the board. This process is still being discussed, though, and may be changed. The packet to submit ideas is on reso.org. Chad Curry (CRT) will send out a link.
Data Dictionary: Workgroup members bring up ideas for changes and additions to the dictionary. Those go into a list and come up in the workgroup. That brought us from 1.0 to the 1.1 Draft. We go into a public comment period, then incorporate what changes we can – but only urgent changes are approved. We don’t want it to go out for public comment and then change 20% of it! We got through most of the public comments – marking things “Yes and urgent”, “Yes, but can wait for 1.2”, or “No”. That’s all out there in the Dictionary’s Google Group. Version 1.1 is now with Chad and it’s up to the board to adopt it. We do have changes that have come in since the document went out for comment, and we’ll work toward getting those into version 1.2 in December. I’d like any comments by June 15 so we can discuss items during our Dictionary meeting in July. Then we can create something for public comment, and the process continues from there. Yesterday we decided not to create a new structure for “Caravan” but rather to make that a type of “Open House”. On Commercial, we would work with OSCRE where we could. We would create a subset and stay aligned with the other group, but we wouldn’t try to duplicate their entire standard inside RETS. It was asked if commercial sheets should be submitted. Rob asked that, if possible, submissions be put in the Dictionary format (spreadsheet), but “sheets” would be acceptable. The ePropertyData and Xceligent CIE framework – that hybrid may be submitted to the group for consideration as well – but there may be IP rights issues that would have to be dealt with first, since they are not members of RESO. Saved search: we discussed the scope of that and asked that MLS vendors help with it via a technical subgroup. We also need to solve some technical issues in the dictionary with regard to repeating elements – expect some change there. That work will start in the next week. We do want MLSs to contribute enumerations as well.
Remember, there will be a July meeting in San Dimas at Rob Larson’s place, so get your ideas in early June!
Syndication: We have some cleanup work to do in existing Schema, then a meeting in June. Then we will go dormant until people start implementing the data dictionary.
Bob G. brought up the new Authentication and Authorization Workgroup – the first step is to figure out what to report. Bob’s looking for members for that group.
Transport: Scott P. first talked about a need for a better way of sharing info. We’re going to get an initial proposal for what the initial version will look like on the table ASAP for the community, so they can see we’ve done our work and get feedback before moving forward. OData will be explored to make sure it does everything needed. That’s over the next month. Email the board or committee if you want to participate – calls are the first Monday of every month, though we may meet more frequently for the near future. In-person meetings may also be planned. We’ll also try to get some external expertise regarding OData. Matt M. summed up the criteria: the MRIS API and the Spark API are very specific to the MLS and MLS data spaces. RESO needs to interact with other aspects of the real estate industry (CoreLogic and others certainly have that need), so we chose not to go with those two – that way we can go into those spaces as a group. Mike W. said, “We were going for the 90% use case with Spark. We can put an OData wrapper on it.” Scott P. says, “We’ll learn from what you have accomplished since you are a member of the committee.” Mark L. says, “Everyone should play with RESTful API tools and OData to get familiar with them.” At the next meeting we won’t just have a proposal but sample code, so people can see what this will be like to implement.
Technical Committee: Chad Curry (Chair) reported: We’ve revamped how the tech committee works. It used to be just board members. Now it’s comprised of the chairs and vice chairs of the workgroups, plus a chair and vice chair from the RESO board. We know we don’t have a perfect process, but we’re working on it. We’re going to work on creating a more transparent process. Monthly reports will come from the workgroups, and Chad will email out executive summaries in the future. The committee has been pushing for more action – creating milestones. We’re implementing Trello as a project management tool. It’s online and also has iOS and Android clients. We’ll also get a technical writer. We recently had a planning session, and it produced the following action items:
- Expand data dictionary – we’re making progress on that. 1.1 now, and 1.2 in the fall
- Publish an API for security/transport – we’re in progress with two workgroups working on that.
- Develop a workgroup for integrations (I/O) – being discussed
- Collaboratively create real property definition – We’ll be defining what that means in the tech committee. That is tied into the Universal Property Identifier
- Engage in a conversation re: Universal Property Identifier.
- Commercial: the Dictionary workgroup is looking at that at present.
- Hire a tech writer for transport and dictionary workgroups. We’ll do that.
- Document an API (obviously that’s dependent on the API being defined)
- We also need to define a migration path
Chad will send out a more comprehensive list to attendees.
We talked a bit more about the need for a single place to manage all the progress. Will Trello have SSO from RESO.org? Chad promises to improve our process.
Bob G.: Come on up to talk to me. Please participate in workgroups – the success of RESO is in your hands!