General Observations
The Multistakeholder Strategy and Strategic Initiatives Department (MSSI) developed a survey which was sent to the Competition, Consumer Trust and Consumer Choice (CCT) Review Team. The objective of the survey was to gather feedback after the Review work concluded.
...
The survey gave respondents the opportunity to provide additional comments about improvements, if any, to ICANN organization support. Below are the comments received from respondents. [All comments appear as submitted, without editing.]
The assistance of the MSSI Team cannot be overvalued. Special mention of Lisa Phifer
Management of the overage timeline
The CCT Review Team would have gotten off to a quicker start if there had been clearer guidance in the beginning about what data was available and what data was missing with regard to the research areas. It took several months for the CCT Review Team to figure out where data may exist within ICANN (or elsewhere). ICANN Support Staff was fantastic but some ICANN staff members were not very responsive to CCT RT data requests in the beginning. Krista Papac
It would be helpful for ICANN org to be able to figure out how to encourage broader participation from participants. We were really slowed down by being stuck with just a few people contributing the vast majority of the work.
Budget:
The most egregious problem was the availability of relevant data and information. Some of that is currently not available in the ICANN organisation but with third parties. Data sharing arrangements with third parties should go a long way to cure this problem.
...
Question | Number of Responses | 1 (least) | 2 | 3 | 4 | 5 (most)
---|---|---|---|---|---|---
How effective was the CCT Review Team at providing status updates on a regular basis to the SOs/ACs and the broader ICANN community? | 8 total | 0.00% | 0.00% | 25.00% (2) | 25.00% (2) | 50.00% (4)
How effective was the CCT Review Team at incorporating feedback received from the ICANN community? | 8 total | 0.00% | 0.00% | 12.50% (1) | 37.50% (3) | 50.00% (4)
How satisfied were you with the duration of the overall review and the overall effort of the review team? | 8 total | 25.00% (2) | 0.00% | 25.00% (2) | 50.00% (4) | 0.00%
How satisfied were you with the application process? | 8 total | 0.00% | 0.00% | 12.50% (1) | 62.50% (5) | 25.00% (2)
How satisfied are you overall with the conduct of the review? | 8 total | 0.00% | 12.50% (1) | 12.50% (1) | 62.50% (5) | 12.50% (1)
...
Below are the comments received from respondents on the assessment of the CCT Review Team. [All comments appear as submitted, without editing.]
A core group of review team members did nearly all of the work. Some members did not do more than dial in to calls. This was a bit frustrating during some of the more onerous parts of the review. However, the core group worked together really well and the broader group was always collegial and genuinely cared about the issues. |
There was a core group that did most of the work, a second tier that contributed from time to time if nudged, and a last category of folks who did very little. This was somewhat frustrating. |
This took way too long, and it was hard to stay in sync with the community over the duration (there was much more engagement early on). |
We should have organized around our budget and around community priorities better perhaps so that fewer items were OBE (overtaken by events) in the course of the review. The review perhaps took too long which was also a function of prioritization of activities, particularly outside research that could have been happening in parallel more often. |
Additional Comments
Comments on how the review was conducted (what worked well, suggestions for improvements, etc.) are below. [All comments appear as submitted, without editing.]
Not to divide the review into 2 separate parallel mini reviews
The core group really did a heavy lift to ensure that the commitments of the review team were fulfill and a data-driven assessment of the new gTLD program occurred
There was a separation between the information sought prior to the review started and the information that the Review Team itself thought would be useful. This resulted in a very inefficient use of resources, and extended the time needed for this review. Ideally, the review team would be involved from the start of the process to asses what studies would be useful to carry out its mandate.
We got a good start, but eventually were extremely bogged down with very little progress in the second year of the review. We probably just should have accepted some limitation in scope and done what we could on the first pass.
It's noted pretty clearly in the final review document but the review is sorely lacking in data in a number of important areas. Data has got to become a bigger priority inside ICANN generally.
Success of these reviews rests on information access and sharing. I find that the substantive analysis rests on discussion. Those discussions work well and advance more quickly to consensus when team members are face-to-face.
...
What improvements, if any, would you suggest to make the construct of Specific Reviews more effective? [All comments appear as submitted, without editing.]
More data, sooner. Also, it is important to continue to allow for independent experts to be appointed. The CCT Review team was fortunate enough to have an economist and cybersecurity expert because they were not required to be appointed by an official ICANN community.
Is there a way to assess whether members will actually do the work?
Probably don't allow an mid-stream addition to the report, and force fit to some timelines rather than allowing the desire for more information to extend out the duration of the review.
1. Earlier involvement of board caucus group to access priorities
Information access and sharing, especially with third party entities are always going to help the team to move more rapidly to conclusions.
...