[This Transcript is Unedited]

Department of Health and Human Services

Subcommittee on Privacy, Confidentiality and Security

September 16, 2013

National Center for Health Statistics
Auditorium A and B
3311 Toledo Road
Hyattsville, MD 20782


P R O C E E D I N G S (4:28 p.m.)

Agenda Item: Welcome

DR. FRANCIS: Okay, now we need to talk about our follow-on. I guess we should say that we’re now in business as the Privacy, Confidentiality and Security Subcommittee and we would like first of all to talk about –

I guess actually in terms of the order of this, just simply to make an announcement that for all members of the Privacy Subcommittee, and anyone else actually who would like to let us know that they’d like to be part of this, on the 30th of September there’s a virtual hearing on accounting for disclosures. I have, and would happily send around, a draft copy of the questions that were the subject of discussion earlier today.

I think one of the most important things that I might just fill people in on is I filtered some comments from committee chairs and from Walter and it’s very clear they’re going to carefully distinguish between disclosures, internal uses and access internally. By our pushing the question, members of the Tiger team actually realized that half of them thought it was only going to be about disclosures. And the other half thought it was going to be an accounting for any access. So I think we’ve already played an important role.

MS. KLOSS: How much opportunity will there be to ask questions, discuss – do you have a sense of how –

DR. FRANCIS: Hopefully a lot, and we are all registered as full participants. So that means all the members of our Privacy, Confidentiality and Security Subcommittee. But if anybody else wants to be hooked up with the Altarum numbers, it’s a virtual hearing. The plan is to have quite a lot of time for discussion.

People are being held to very short presentations, as I understand it. I don’t have the full list of who’s going to be actually presenting. I have a partial list. It’s very thin on the privacy side, at least in terms of what I saw.

There’s a possibility that if we don’t think that there’s enough information about accounting for disclosures that we will also be able to participate, I mean to have our own hearing later. But that’s up to us. They’re fully aware –

I thought either Sue McAndrew or is there anybody else on the phone – oh, hi Sue. I was going to say I thought that you were going to be calling in from OCR. So maybe you might want to just say a little bit about what OCR is hoping to get out of this.

MS. MCANDREW: Sure. One of the things, once we got past the omnibus, that we wanted to turn back to was closing the loop on the rulemaking on accounting for disclosures and coming up with potential ways of addressing the statutory obligation that would either allow us to go forward with some sort of access report or to test out different ways of meeting that statutory requirement.

And so we were beginning to touch base with the various stakeholders in terms of any evolution, I guess would be the best way to put it, of concerns or issues with regard to how to meet that statutory requirement. Obviously, we did not get a lot of positive feedback from the NPRM.

In touching base with ONC, we agreed that one of the things that we should take a careful look at again was where technology was in terms of what it would support. Some time had passed, and with the rapid pace of technology evolution, that seemed a good place to start to see how things might have changed since the landscape had changed.

So we did some limited testing of the waters – we are viewing this as fact-finding, because we are still technically within the rulemaking process; we still have the NPRM out and active. And so we did that, and it became apparent that a broader expansion of our fact-finding was necessary, and so we decided to try to engage the advisory committees at ONC to help us have a more public discussion about the technology.

And once we started down that path then it became apparent that we really should not just be focused so much on technology at this point but make this a more transparent update on all the stakeholders involved in trying to find a solution for this piece of legislation.

So that’s where we are. We’re hoping to get out of it either indications from the stakeholders that they are where they were at the time of the NPRM or that since then they have tried something new and this is how they are now viewing the world.

I don’t know – we’re hoping to find from consumers again what they are looking for as a means of a solution. What kind of information really is important to them at this stage, now that there are more Blue Buttons and more electronic health records out there. Many would have more direct experience than they might have had even 2 years ago. So we are just looking for additional information to help us find out where everybody is today in this arena so that we can begin to assemble that information to find a better path forward.

DR. FRANCIS: Thank you. That is very helpful, Sue, because it helps us think about what kinds of information we might help make sure get surfaced. So maybe I’ll just open it for a few minutes of discussion to members of the committee and others here, just to comment on what sorts of questions they might like to ask, see get asked, or hope will get asked at the virtual hearing.

DR. SUAREZ: Yes, this is Walter Suarez with Kaiser Permanente. I actually have seen some of the draft of the questions that were proposed before and I am concerned about a couple things.

Number one, I think this is a very large group of people that are going to be attending live, because it’s not only the Tiger team, which has about 50 members, but also this subcommittee and also the Health IT Standards Committee Privacy and Security Workgroup, which I co-chair. And so there’s going to be a large number of people all interested in asking questions. So that’s one challenge.

Number two, this is about 5 hours total – roughly an 11:30 a.m. to 5:00 p.m. schedule, I believe, which also includes a couple of breaks. And there are four different panels testifying. So I’m concerned that we’re certainly not going to be able to get into the details of the issues that we really need to hear about.

And then thirdly, I think there is still significant confusion out there – and it’s expressed by what you just said, Leslie, at the opening – about access, use and disclosure. The word access is used confusingly across the various questions and across some of the comments that are being made. People think access means access by someone outside, and outside means something very concrete in the HIPAA realm.

So in my suggestions to clarify the questions, I tried to emphasize as much as I could the importance of making sure that people – even within the Tiger team, and certainly members of our group here as well as the Privacy and Security Workgroup of the Standards Committee – understand the difference between a use, a disclosure, and the confusing word access, because access can be done for use purposes or for disclosure purposes. So I think that is a major area of confusion.
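To make that distinction concrete, here is a minimal illustrative sketch in code – the organization names and the simple inside/outside test are hypothetical simplifications for illustration, not HIPAA definitions:

```python
# Toy model: an "access" event is not itself a use or a disclosure;
# it is classified by whether the actor is inside or outside the
# entity holding the record (a rough simplification of HIPAA terms).
from enum import Enum

class EventKind(Enum):
    USE = "internal use"
    DISCLOSURE = "disclosure"

def classify_access(actor_org: str, record_holder_org: str) -> EventKind:
    # Access by someone inside the covered entity is a use;
    # access by an outside party is a disclosure.
    if actor_org == record_holder_org:
        return EventKind.USE
    return EventKind.DISCLOSURE

print(classify_access("General Hospital", "General Hospital"))  # EventKind.USE
print(classify_access("Outside Clinic", "General Hospital"))    # EventKind.DISCLOSURE
```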

And then lastly – in my mind the most significant direction in terms of framing the input that we all want to talk about – is distinguishing between health information privacy policy, which is one level; health information technology privacy policy, which is the application of technology within the realm of privacy; and then health information technology standards.

And I think we need to ask questions about the overarching issues around health information privacy policy and not exclusively in the context of health information technology. In other words, even if the technology allows for certain capabilities, the first question is: is that the right policy, or is it not? And so I think we’re going to, in my mind, see a lot of questions across the board on whether the technology is capable of doing this. Can the technology tell the color of the eyes of the person that is seeing the record, record that, and then document it in some access report?

And I’m concerned that technologists are going to say, oh yes, we can do that or if you want it, we can develop a system to do it. Is that the right policy? Should that be the right policy?

So in other words, my point is I think it’s a very short timeframe – 5 hours – for a very significant topic, accounting of disclosures, with three major different scopes of questions: health information privacy policy in the first place, and then these other questions about the technology side.

DR. FRANCIS: So, thank you. Sallie.

MS. MILAM: Hi Sue, this is Sallie Milam. I don’t know if you can hear me. I echo Walter’s concern. In discussions with folks generally about protected health information and any PII, people don’t understand or differentiate use versus disclosure. I’d like to suggest – and the risks behind use are different from those behind disclosure. The actors can be different and the controls are different.

We know employees are our biggest risk, but we have a variety of other risks that attach to use and disclosure, and I think we would get richer conversations if we could group the questions and tease them apart – have questions focused on use, have questions focused on disclosure – so we understand, and do our best to facilitate a discussion of, the difference in why a patient may want information about a certain use or a certain disclosure, and then the impact on the facility.

Later in the draft questions there is an interest – and I understand the questions have changed – but one of the goals in an early draft was called goal three: gain a greater understanding of how record access transparency is currently working for health care providers. I guess I’d like to delve into that a little bit further from the health care provider standpoint, from either a covered entity or a business associate standpoint.

I’m wondering how often health care providers or their business associates pull audit logs now. I am wondering if we have underreporting of incidents now – if providers are even aware of all of the unauthorized uses and disclosures that are occurring.

So I’m concerned that we need to maybe get a better handle on this and ramp up supports for our HIPAA-impacted entities so that they are prepared to start sharing this information with patients, who will be identifying incidents. Do they have strong incident response programs? When a patient comes to them and adequately calls out an unauthorized use or disclosure, will they know how to deal with it?

From where I sit and what I’ve seen, it’s probably true that most organizations are not proactive in looking at their audit logs to the extent that they should be. And so I’m wondering if we need more questions to sort of call out how to support the provider and plan community to get ready to have patients delving in to this kind of information.
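As a minimal sketch of what proactive audit-log review could look like – assuming a hypothetical CSV export of EHR access events with a user_id column; real audit formats vary by vendor:

```python
# Sample an audit-log export and flag users whose record-access volume
# is an outlier relative to their peers, as a starting point for the
# kind of proactive review discussed above.
import csv
import statistics
from collections import Counter

def flag_outlier_users(audit_csv_path, threshold_sd=3.0):
    """Count access events per user; flag counts more than
    threshold_sd standard deviations above the mean."""
    counts = Counter()
    with open(audit_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["user_id"]] += 1
    values = list(counts.values())
    if len(values) < 2:
        return []
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(user, n) for user, n in counts.items()
            if sd > 0 and (n - mean) / sd > threshold_sd]

# Example (hypothetical file): flag_outlier_users("audit_log.csv")
```

A flagged user is only a lead for human follow-up, not a finding; the point is that even simple sampling like this is more than many organizations do today.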

MS. KLOSS: Yes, this is Linda Kloss. One of the areas I hear about repeatedly is that while a provider organization might have good controls on the primary process of disclosing information in response to an authorization, what’s happening more and more, because of decentralization of process, is that disclosure is occurring in pockets throughout the organization.

There might be disclosure through the emergency room or through a department and outside of the scope of the individuals who may in fact represent that the process is working well. So I think that there are a lot of levels of complexity to this and that the primary EHR technology may not be the only area of vulnerability or functionality that’s of concern because it’s happening throughout organizations especially if there is less central control.

DR. FRANCIS: I am going to ask Maya if she’s got a good list of some of these concerns and could send it around to all of us so that – as well as to the chairs of the virtual hearing – so that people know that these are some of the fact questions we’d like to make sure get surfaced during the hearing.

MS. BERNSTEIN: I’ve been taking detailed notes.

DR. FRANCIS: I know, I’ve been seeing – Sue, would that be helpful? I’ve been trying to figure out a process that would enable some of this discussion to be efficiently used either during the hearing or afterward.

MS. MCANDREW: Frankly, I will have to say I have not been working directly with the leaders of the Tiger team on this hearing. One, because I’m not going to be here when the hearing is held, and two, I did want to have some arms-length relationship with regard to their ability to control the mechanics, I guess, of conducting this hearing – largely to preserve our position on some discussion which may be more policy in orientation than fact-finding, which can be done but really needs to be at the behest of the team, in a neutral kind of third-party venue.

DR. FRANCIS: What I am hearing is that it would be very helpful for you to have us focus on fact-finding rather than policy. So for example, how frequently audits are generated is a fact question – the question Sallie raised.

So I gather from Walter that the draft questions have already gone – or the questions have already gone out to testifiers so we’ll make sure they get sent around together with Maya’s notes and –

MS. MCANDREW: It is perfectly legitimate for either you guys or the members of the Tiger team or of the other committee to delve into the more policy-oriented type of questions. It’s just that, given the rulemaking status, we cannot convene a group for that purpose.

DR. FRANCIS: So I have another question, while you’re – are there any other comments, first of all, about the upcoming virtual hearing? Okay, I think we’re set on that.

So while you’re with us, Sue, are there some issues that you’d like particularly to call to our attention, anything with respect to the end of HIPAA or other data privacy issues that you think it might be relevant for this subcommittee to be looking at? For example, valuing costs of disclosure, damages from disclosure?

MS. MCANDREW: I do think that we did have this one final part of the HITECH Act that we will need to do rulemaking on, and it certainly is at a very investigatory stage: that is the sharing of CMP or settlement amount collections with “harmed” individuals from disclosures. Leslie and I have had some discussion, and I did share with you what the AO had to say, right?

DR. FRANCIS: Yes, you sent it to me just because I was interested in the question. But how to figure out the whole issue of civil monetary penalties and sharing them is something Sue and I talked about that might be relevant.

MS. MCANDREW: So we would be more than happy, to the extent that there is interest and time, for the subcommittee to really help us in that area. As you noted, one of the fundamental questions is how do you quantify harm and how can you value it, particularly in an administrative claims type of process.

So while there are these emotional distress and other kinds of tort remedies and values placed on harms, the question is whether the process and documentation to prove those levels of harm are in any way appropriate for this setting. Or is there something else that can be looked to as a way of identifying what kinds of harm should be represented in here and then how do you put a dollar value on that.

I think the other aspect of this that we would be struggling with – I really don’t know how much this would fall in your bailiwick – is the best mechanism for the delivery of these funds. Is it some sort of trust fund that would be set up at the federal level? Is this something that would be better placed back at the entity? Is this something where we should be considering hiring a neutral fiduciary to administer it? How does this actually get determined? What kinds of models might be out there for the actual administering of the set-asides to compensate individuals?

DR. FRANCIS: Fascinating questions.

MS. MCANDREW: It’s a very interesting – it sounded so commonsensical, I guess, when it first appeared in the statute but then trying to figure out how you would actually do this has gotten into areas that we just are not at all familiar with.

DR. FRANCIS: Thanks. So that’s something we’ll put on our agenda for discussion about how we might wrap our hands around it if we want to.

We’ve got about half an hour and that’s it. The other issue that we need to talk about is our ongoing collaboration with Population Health. And Sue, I’d love it if you’re willing to stay on, partly because I have this question. As we look at what some of the elements are of the stewardship framework that we might particularly want to develop, several have sort of risen to the surface.

One is all the issues about re-identification. Another is the issues about transparency and accountability. And another one is the issues about protection after data have been released through data use agreements. Those are just three of the areas that we’re thinking about developing some discussion of the issue and case studies that might be useful for people who get the data.

Now, a lot of this is non-HIPAA. Some of it is data that gets shared through a data use agreement that is HIPAA. So – I guess the floor is open. I’ve given the quick outline of where – as we’ve had committee discussions – where we are but committee, where to go?

MS. KLOSS: I think that short list also was kind of reaffirmed by the extended group from the roundtable that was interested in these topics. So it wasn’t just the subcommittee. We didn’t figure it made sense to tackle all eight of the areas of the stewardship framework, but to focus on those that would be most relevant.

MS. MILAM: To bring in a conversation we had with Paul over lunch. Paul was asking the question: shouldn’t some activities maybe just be made illegal, or certain things be made required? And I’m thinking back to a couple of years ago, I think, when a number of large private sector organizations – in the non-health care scenario – talked about getting away from old consent models. They had a catchy phrase, moving to “rules and tools” and away from consents and notices that were hard to read and hard to follow.

So I’m wondering if there’s a way to make data sharing more open without the complexities but having it clear when you do the wrong thing, there are actionable penalties.

And another issue that Paul had was struggling through contractual negotiations and all of the fine print: you work hard on having a good business associate agreement, and then it could be undone if you’re a small practice or a provider without a legal team right on hand, signing contracts that essentially unwind the BAA that you’ve put in place. So those were two issues that Paul brought forward that I thought were very real, and I know a challenge for many people today.

DR. FRANCIS: Maybe a summary actually that’s relevant to having Sue on the line is Paul sees it as about effective enforcement and I guess some of us – well, effective enforcement is clearly useful, critical, but you need to know that there’s been a problem before you take – and I don’t know how good we are at picking that up.

So where do subcommittee members want to go? In other words, subcommittee members want to go home. We drew the short straw, we’re the last ones. We get the first slot next time.

How would people think about the following challenge? If you wanted a case study that was a good case study about accountability, what would it look like – do you know of an example where data accountability, data steward accountability was done well? We would need an example where transparency, particularly with respect to data repurposing, was done well, if indeed it can be done well. We would need an example of community data use that raised the questions of either small area issues or mosaic effects with data aggregation and how that was done well and we would need an example of a good data use agreement. How a data use agreement was done well and followed up on well.

MS. BERNSTEIN: And this is a hypothetical case study, right? Well, when you say a good data use agreement that was done well, you mean there had to have been a problem with it that was – some provisions were used to keep someone accountable. That something went wrong and that the data use agreement anticipated it.

DR. FRANCIS: Or that this was a data use agreement that anticipated the kinds of things that might go wrong, provided for them and then followed up to see whether or not they happened. I know of almost no cases of reported violations of a data use agreement. But I don’t know whether that’s because there are none or because nobody knows whether there have been any.

MS. KLOSS: Are we doing this in the context of community data use?

MS. MILAM: So it’s non-HIPAA. They are not a covered entity.

DR. FRANCIS: The community, no. But they might get data from a covered entity under a data use agreement.

MS. MILAM: Or a data use agreement from a non-covered entity. Most data organizations at the state level or lower don’t have the resources to put into auditing data use agreements, so I don’t know of many who have gone out proactively to determine whether or not all of the security and privacy protections that are in the agreement are actually being carried out. I don’t know where we would find that.

MS. BERNSTEIN: The IRS and NSA – the people that have significant resources. The IRS asserts, for example, that when it discloses tax return information, it can go and investigate. It doesn’t do it with everyone, but it randomly goes and looks, or it takes a sampling – there’s a sampling of the safes in the classified world and of the other provisions like that. But they have significant resources to do that, and not many other communities have those kinds of resources; they’re spending it on the stuff that we would probably find more interesting. So a lot of that is built on trust, though, which is reputation and past track record and so forth.

DR. CORNELIUS: I keep hearing the idea of hypothetical as opposed to actual as I’m hearing this conversation. Because let’s take the IRS example. If I was on the other side, the recipient of this problem in the data use agreement, and now my example is being uplifted and being processed and shared, I don’t know if I’d feel comfortable with that.

MS. BERNSTEIN: For security reasons, you mean, that your security would be – you certainly don’t want too much information about your security methods out there, right.

DR. CORNELIUS: Let’s say I am Lee Cornelius, data user. The IRS has pursued me. I didn’t really do my job, so that’s a business issue between the IRS and me. But now, for the purposes of this committee, this becomes a real life example – that’s a different issue.

MS. BERNSTEIN: I understand. I probably didn’t speak precisely enough. What I mean is that the IRS, on a regular basis, samples the people it does business with – whether they are complying or not – as part of the auditing process, and that’s great.

DR. CORNELIUS: It is not so much about the IRS; it’s the hypothetical versus the actual. The more we talk about this, the more I would recommend a hypothetical case as opposed to an actual real life scenario.

DR. FRANCIS: Right, thanks. Bill.

DR. STEAD: Again, I am having trouble really knowing how you’re scoping the problem. The central fact of the matter is that today’s information technology is not architected to build in protection for security in large part, and definitely not for privacy, where in privacy we’re actually trying to match up people’s preferences.

And so I think in large part that’s probably Paul’s plea. Everything’s about balance. If we had a balanced approach to putting the people that misuse information in jail while trying to protect the banks, we would be better off than thinking we could solve the problem totally by trying to protect the banks instead of also trying to put bank robbers in jail. So I believe that’s Paul’s point.

In this kind of situation, you can find cases where data protection and data use agreements are relatively well matched. So take an example from Vanderbilt, where we have a de-identified extract of our electronic health records matched up with DNA specimens in an opt-in, opt-out model that has been reviewed federally and found not to count as personally identified research data. Despite that finding, our data use agreements do not allow use without the collaboration of a Vanderbilt co-investigator – not because we’re trying to get credit for research, but because we want someone we can fire if they do something wrong.

So that’s an example of how you achieve this balance. I wouldn’t describe it as perfect, good, anything other than an example of achieving the balance. So I think you can get some of those examples as long as you recognize that we are in fact operating in a world where you’re really trying to balance, from the perspective of the person that releases the information, risk versus effect.

DR. FRANCIS: Thank you. I think we recognize that and that’s partly why we want to have some – even though Vanderbilt, for example, would not be identified, but we want to have some examples of the kind of thing you could do that does a good job with that balance.

MR. SOONTHORNSIMA: Another way of looking at it – and this is probably a dose of reality here – is that if you really think about all the different tools that we use today, like security, privacy, and data use agreements, I think a lot of them are meant as deterrents more than anything else.

Now, short of the techniques that you use today for passwords, IDs and so forth, and even data encryption – short of those things, the rest are, I’m not sure they’re more than deterrence measures, because I don’t believe people really go behind the scenes and start auditing these things. And that’s a reality. I’m not sure any organization, large or small, has enough resources to really go behind the scenes and ask: are you really destroying the data at the end of the day, once the data use agreement terminates?

So to come back to your question about coming up with sort of ideal situations, I think we have to balance that with the reality and ask what we are really trying to go after. Are we trying to advocate, oh no, you guys have to really go out and audit, when in reality there are really no resources to do that? So, I don’t know, I’m just bringing it up – maybe go after what deterrent mechanisms might be more realistic for large or small organizations.

DR. COHEN: So my question is: which communities, and under what circumstances, would want this information? How many communities use this kind of information, and do we anticipate that more communities will?

I’m really trying to think about my experience with communities using data – some communities having limited data agreements with the state for primarily aggregated data, but mainly de-identified surveillance data in very limited situations. And when we talked earlier about community readiness, I think a huge issue that needs to be addressed is what the capacities of communities are to understand and use these data.

DR. SUAREZ: I think – you wrote down balance here – I think the reality is all this is done within the context of risk management. That’s what this is primarily all about: analyzing and assessing the degree to which there is a risk of a particular event happening and then putting in place mechanisms to address that risk.

That’s what security is all about, really, and that’s what HIPAA calls for in the 42 implementation specifications for meeting the HIPAA Security Rule: an evaluation of risk and a formulation of the strategies to manage that risk.

And if I am going to write a data use agreement and give data to someone, I’m going to implement certain mechanisms to manage the risk that my organization is going to be exposed to if there are any problems with what the receiving entity does with that data. If I see that the risk is enough to justify me going inside the organization and doing a thorough evaluation of their security capabilities because I’m concerned about the risk, then I’m going to do that.

So now bring that to the reality of communities and community data initiatives. There is something that is not talked about: the concept of risk management within the context of community data initiatives. And I think that’s what we can bring in by partnering with public health or population health – really defining a risk management framework for community data initiatives.
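As an illustrative sketch of the simplest piece of such a framework – a risk register scored by likelihood and impact – here is a minimal example; the specific risks and scores are hypothetical placeholders:

```python
# A toy risk register for a community data initiative: score each
# identified risk by likelihood and impact, then rank so the steepest
# risks get mitigations (e.g., site audits, DUA terms) first.
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: int  # 1 (rare) .. 5 (frequent)
    impact: int      # 1 (minor) .. 5 (severe)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

register = [
    Risk("re-identification of small-area data", 2, 5),
    Risk("recipient fails to destroy data when the DUA ends", 3, 4),
    Risk("unauthorized internal use by recipient staff", 3, 3),
]

# Highest-scoring risks first, to drive where mitigation effort goes.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.score:2d}  {risk.name}")
```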

DR. FRANCIS: So just to sort of generalize from that, maybe one of the really helpful things we could do is identify some of the risks that communities might want to think about and what are some of the ways to handle those risks.

MS. KLOSS: I keep trying to think about concrete deliverables. You’ll be happy to know that, Larry.

After we had our first privacy community meeting, we came away with the idea that maybe it’s time to revisit the 2009 stewardship primer and redevelop that primer for this out-of-HIPAA context. And I think that’s what we were doing when we developed the framework, knowing that that only went so far and we needed to flesh it out.

I think where our subcommittee continues to struggle is that while we think that would be a really valuable thing – and you could do it as a checklist or a series of principles and assessment practices – we need staff on the ground to do the research and drafting of this. It’s just not going to happen with the volunteer subcommittee members.

So I think that would be a valuable work product and a logical next step to what we’ve done. But we really would need to figure out how to get it done.

DR. FRANCIS: I think it is fair to say that we have wonderful staff help, but we have not had anywhere near the same level as either of the other subcommittees, just in terms of straight-out time.

MS. GREENBERG: I’m not going there. But as I have said many times when this has come up – I don’t think I’ve ever turned down a specific request. So I need a specific request, and particularly if you know somebody or can suggest somebody who would be good to contract with to do this.

This might be a great job for your intern, particularly if this is a person with a JD and MPH.

But in any event, they’re coming to the National Committee already. But anyway – I’m not trying to get off on the cheap – I mean it. And I’m still here for 2 more months so grab the moment.

MS. KLOSS: I really want to see us move this along because I think we’ve got something that is a valuable starting point.

DR. FRANCIS: I’ll talk with Maya and other staff members about what might make sense here so we can make you –

MS. GREENBERG: Obviously, the co-chairs and Maya, the subcommittee, would need to be very involved. But you would want to be. Scope it out and conceptualize it and all that. But I understand, none of you – this is a job.

DR. FRANCIS: We can’t do the environmental scan and the writing.

MS. GREENBERG: No. So the offer is on the table.

DR. FRANCIS: Thank you. We will take you up on it. Sallie and Linda and Bruce.

MS. MILAM: Just very quickly, what I think might be helpful to communities is how to assess their own privacy requirements. They’re getting data from a variety of sources. They’re collecting it. They’re getting it from the state level, from federal sources, arguably from the health information exchange, from facilities. So they have a variety of requirements – some contractual, some from law, some from HIPAA.

So how do they start? If we had some sort of a matrix that gave all of the possible privacy requirements out there, I think it would be a starting point for them – it would identify for them what they need to be compliant with.

MS. GOSS: And to that point, Sallie, I think adding the flavor of the different models that are prevalent across the variety of state setups is important. I know within Pennsylvania, we’ve established a certification program which has the rules of the road for breaches, incident management, how we’re all going to play together, and then we’ve built a trust community to further those discussions in this very federated environment – which goes back to your point about trust.

But, let me tell you, it’s a moving target because every time we think we’ve got a good framework, we start to think about something new we need to add.

MS. MILAM: So just building on Alix’s point really quickly. The matrix needs to reflect whether there is a standard that’s the most stringent, so you don’t have to understand all of the others at the end of the day; you just need to comply with the most stringent, if that achieves your compliance with the rest, as it may in Pennsylvania.
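A minimal sketch of what such a matrix might look like in code – the topics, sources, and stringency ranks below are hypothetical placeholders, not actual legal requirements:

```python
# For each privacy topic, record which sources impose a requirement and
# how stringent each is; then pick the most stringent source per topic,
# on the theory that complying with it satisfies the rest.
requirements = {
    # topic: {source: stringency rank (higher = more stringent)}
    "breach notification window": {"HIPAA": 2, "state law": 3, "DUA": 1},
    "minimum cell size for data release": {"state law": 2, "DUA": 3},
    "encryption at rest": {"HIPAA": 1, "DUA": 2},
}

def most_stringent(matrix):
    """Return, for each topic, the source imposing the strictest rule."""
    return {topic: max(sources, key=sources.get)
            for topic, sources in matrix.items()}

for topic, source in most_stringent(requirements).items():
    print(f"{topic}: comply with {source}")
```

One caveat: the most-stringent-wins shortcut only works where requirements are comparable on a single axis, which is part of why such a matrix stays a moving target, as Alix notes.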

DR. FRANCIS: So Bruce and Vickie.

DR. COHEN: I want to build quickly on Sallie’s point. I think we should try to identify communities that are rich intensive data users already and then evaluate their privacy and security practices. And start from something that exists rather than try to create a theoretical model. Because there are communities out there that are heavy data users and that would be a real example.

MS. MAYS: Again, I want to build on what both of you are saying because that – Sallie, what I was thinking is that to some extent we’re kind of bringing up the enforcement issues and I’m concerned that these communities will get caught between a rock and a hard place because they don’t even know what they’re doing.

But I think instead if we do what you’re saying, which is it’s more on here’s a model, here’s a way and that what you do is to have – okay, here’s what the rules are, here’s what you need, here’s the kind of person that can help you to do this. Because not all these communities are going to have it so you want to point to contact such and such organization or contact this office in the federal government.

Because I think we want less of they’re going to get in trouble about it and more of a how do we help you to ramp up. I think you even want to say to them can you start putting this little line for this type of person in your grant proposal even though it’s like $20,000, you need an information officer or you need – so I think it has to be that concrete if you’re going to really drop down in terms of the level of the community.

Or else what we’re going to create is people are going to get caught in some kind of enforcement thing and they just had no idea even. So I want to see this more on the side of it’s instructive, it’s helpful and every time they go into box A to figure out what they need, there’s a set of resources that can range from – and we talked about this before – things like a YouTube instruction on here’s how you do this and here are places where there are forms, all the way to here’s the office in the federal government that can help you or here’s the office in your state to go to.

DR. FRANCIS: Thank you. Final wrap-up comments. I think Linda and I have a whole lot of next steps ready to roll so stay tuned. Thank you all very much for lasting until just about 5:30 and for all the helpful remarks. And Sue, if you’re still on the phone, thank you so much for joining us.

(Whereupon, the subcommittee adjourned at 5:25 p.m.)