IT Policy Forum 2017-10 Discussion Results

Group A: How should we make collaborative decisions about network and system monitoring?

Initial points:

  • minimal connection with central IT.
  • some colleges have collaboration within the college (CALS, L&S)
  • some group members see connections through Tech Partners and campus meetings as a tie-in.
  • some group members approach departmental support with questions about system and network monitoring.
  • some group members expressed that they'd like to know more about what's going on.
  • wondering what monitoring services are available to consume as a user.

Desired: 

  • delegated access - let me look at what is going on.
    • requires a dashboard.

Suggested:

  • regular emails with the state of your network components (a rough sketch of such a status email appears after this list)
  • don't want to have to visit a website for info - have it come in alerts.
  • shared database of logs/dashboard.
  • shared IT catalog; expertise in other areas.
  • having a 'preset' or recommended security guide
  • service catalog would be nice
  • toolset access and training
    • differences in tool access are difficult - delegated vs. non-delegated.
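
A rough sketch of what the suggested status email could look like, assuming a hypothetical get_component_status() data source and a local mail relay; the addresses, component names, and function names below are illustrative assumptions, not existing campus services:

# Hypothetical sketch: email a periodic summary of network component status
# so staff do not have to visit a website; the data source and addresses
# below are illustrative assumptions only.
import smtplib
from email.message import EmailMessage

def get_component_status():
    # Placeholder: in practice this would query whatever monitoring
    # database or dashboard API the campus tool actually exposes.
    return [
        {"name": "building-switch-1", "state": "up"},
        {"name": "lab-firewall", "state": "degraded"},
    ]

def build_report(components):
    lines = [f"{c['name']}: {c['state']}" for c in components]
    return "Network component status:\n" + "\n".join(lines)

def send_report(recipient="it-staff@example.wisc.edu"):
    msg = EmailMessage()
    msg["Subject"] = "Daily network component status"
    msg["From"] = "monitoring@example.wisc.edu"
    msg["To"] = recipient
    msg.set_content(build_report(get_component_status()))
    with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
        smtp.send_message(msg)

if __name__ == "__main__":
    send_report()

Run from cron (or any scheduler) on whatever cadence "regular" means for a given unit; the point is only that the information arrives as an alert rather than requiring a dashboard visit.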

Communications thoughts:

  • use listserv
  • need a way of communicating back to patrons
  • need a way of communicating out to decentralized staff (better contact infrastructure)
  • transparency
  • advisory committee - could be the place that IT staff can take questions about CDM to get it off their plate.

What sorts of decision-making?

  • brief discussion of Cylance as a place for decision-making.
  • mitigation - do we have permission to work on the computer?
  • what sort of scanning is allowed/available
  • transparency - what data is stored, scanned, etc.
  • what triggers are used for closer examination of the above.
  • principles for privacy.
  • recommendations for starting to implement CDM. 
  • communicating out to decentralized groups about monitoring services
  • collaboration with centralized monitoring.
  • governance/decision making.
  • discussion area for questions about continuous monitoring/diagnostics. IT staff (or faculty?) can bring questions to the committee.
    • Who runs the tools/who purchases tools/how the tools work
    • discussion of upcoming tools. identifying needs for new tools. information gathering for gaps.
    • tool life cycles
    • site license management.
    • defining responsibilities for decentralized and centralized IT - ex. firewalls.

Can this group do anything if a unit is ignoring a monitoring tool or task? (no)

  • MUCH of this comes down to resources. This committee probably couldn't do anything about that.

Final points:

  • what can decentralized staff do to monitor? what can central IT do to monitor? how are we sharing that?
  • define recommendations for monitoring/securing the network and endpoints, along with what risk is being mitigated.
  • define principles behind monitoring decisions.
  • communications go through committee.

Group B: How do we assure that privacy and academic freedom are respected as we monitor for vulnerabilities and threats?

Many aspects of Continuous Diagnostics and Mitigation (CDM) are not well understood:

  • The nature and prevalence of the threats/attacks we see.  
  • What monitoring means.
  • Who does this kind of work and the nature of the work.
  • What is examined and what is not examined.
  • How network traffic, data on disk, etc. are analyzed.
  • What we are trying to protect.
  • What can be protected.
  • What our responsibilities and liabilities are.
  • What the impact on individual faculty/staff/students could be in the event of a successful attack.
  • What the impact of CDM operations is on individual faculty/staff/students. How does this affect their research, work, etc.?
  • What are the responsibilities of individual faculty/staff/students related to lowering our potential exposure to data loss, PI theft, etc.
  • What does one have to do to meet these responsibilities?

A few principles:  

  • Transparency in the security space is critical for awareness, and for trust.
  • We should favor approaches that reduce what people are required to know and to do.  (Block phishing sites rather than expect that people will reliably identify phishing attacks.)
  • Do not add to faculty and staff workload.

Ideas for raising awareness:

  • Define what we are protecting and what it is possible to protect.
  • Define what we are monitoring and what it is possible to monitor.  (We see traffic that is on campus and VPN, but we’re not watching traffic in someone’s home.)
  • Expose some elements of the analysis of a detected threat.  Would a clearinghouse for this sort of information be useful?
  • Real academic world examples can motivate learning about security.
  • Provide short audio podcasts (with familiar WPR voices) that people can listen to during their commute.
  • Security protects individuals' privacy from violations by an attacker.
  • Explain how attacks, if successful, could affect the University's reputation.

Ideas for raising trust:

  • Adopt a professional IT code of conduct for staff working in this space, and make clear to staff and the university community what the consequences of violating this code of conduct are.
  • Describe the separation of concerns/duties among staff in this area and what can be known or is visible to these staff.  Describe who can look at what kinds of data/traffic, and who cannot.
  • Describe what is in bounds and out-of-bounds in terms of what data are analyzed, and how data are being analyzed.
  • Provide an independent feedback mechanism to address issues related to privacy, independent, that is, of the group responsible for performing the monitoring and analysis.
  • Invite people to come in and watch (within the bounds of privacy, of course).
  • Note that while monitoring is not looking for violations that fall outside of computing security, if in the course of investigating a security issue a suspected violation is discovered, e.g. a stash of credit card numbers on a personal device, security staff will refer the issue to appropriate authorities. But note that we are not looking for workplace decorum or productivity infractions.
  • Describe what is logged and what is not logged.
  • Describe governance.
  • Note how privacy and academic freedom are principles observed in CDM’s design and implementation.

Ideas for increasing motivation:

  • Use analogies, e.g. identity theft.
  • Support for helping faculty and students know what to do when travelling abroad.
  • Speak the audience’s language.
  • Describe benefits of being a good digital citizen:  1) reduce the likelihood that work is stolen; 2) reduce the likelihood that work is lost.
  • Slogan:  “we worry about it so that you don’t have to.”

Ideas for increasing the likelihood of a successful implementation:

  • Understand how faculty/staff/students use their devices.  Does this vary by population?  L&S vs. med school faculty, for example.  
  • Operating principles and processes in this space should be codified and communicated so that they can be adopted more broadly.  
  • Determine how much people need to know/should be expected to know.

Miscellaneous notes and observations:

  • Faculty reactions to identity finders focused on performance issues. No privacy issues were raised.
  • Agents on laptops?  APM Traps, etc.  We are a ways away from there.  Mitigation vs monitoring.

Group C: How do we create a partnership between local IT professionals and the Office of Cybersecurity?

Partnership definition:

  • Partnership means collaboration between Cybersecurity and local IT to work on solutions or tools to be shared and developed, with bi-directional information sharing and open discussion in which anybody can participate.

Building good partnership for all levels:

  • Cybersecurity needs to understand and recognize that there are different support needs on campus, such as academic, research, clinical, and administrative. Some departments have a large IT group and others have one-person IT shops, so partnership at all levels is needed.

Communication:

  • In order to establish a good partnership, Cybersecurity needs to share information about strategic campus directions so that local IT departments do not need to spend their limited time and resources on tools and requirements that will go away soon.
  • The best way for Cybersecurity to share information about campus strategic directions, service offerings, decisions, etc. with local IT departments may be a combination of the following: email, wiki, website, Office 365 Groups, and Box.
  • There is no good contact list to be used for notifications. Perhaps each department should have an email address like departmentxxx-security@lists.wisc.edu so that multiple contacts at a department can be notified by Cybersecurity.
  • Cybersecurity must use “signed” email when communicating security notifications. Some students from the CSOC send notification email with a first name only and no digital signature. Those emails are deleted because they are not signed, since departments educate users not to click on links in email that is not digitally signed. Students need to use the Cybersecurity CIO email with a digital signature when they communicate with departments.
  • The level of impact could be made clearer. Departments need to know how much time and resources new initiatives will require.
  • Departments prefer to hear about other departments’ success stories rather than a Cybersecurity sales pitch. Cybersecurity could use more success stories to expand security services.

Privacy:

  • Cybersecurity needs to understand that departments such as the Medical School have restrictions on communication tools and data (e.g. HIPAA), and needs to recognize that the relationship between those departments and grants, the IRB, and the HIPAA Privacy Office/Officer needs to be improved so that they avoid misunderstandings about requirements and policies.

Data/Information sharing for benefit:

  • Departments may be able to benefit from other departments’ incidents. Information needs to be anonymized before sharing: a postmortem of the compromise, a layout of the steps to mitigate, how to test for the same security holes to fix, etc.

New types of partnership:

  • Cybersecurity to co-lead with UW-MIST members to initiate/introduce new security services/tools.
  • For one-person IT departments, someone from Cybersecurity needs to come in and do the work first, then educate the IT person so that they can do the work on their own.

Others:

  • We need to be working on the “Baseline” project again.
  • A security education/training service is also needed to educate departments’ end users on how to handle data (e.g. an off-boarded employee left a box full of SSN information under his desk).

Suggestions to resolve Resource issue:

  • Cybersecurity needs to provide concise reports so that local IT does not need to spend a lot of time reading them to figure out what needs to be done.
  • Annual review/check-in by meeting with departments to check how they are doing.
  • Better guidelines on security products are needed to understand what to use for what purpose. (We need a better security toolbox!)
  • Annual security picnic with free food! (so that local IT will be energized)

Group D: How do we build-in bi-directional communications to exchange timely information on vulnerabilities, threats, solutions?

Initial points:

  • How do we identify the most appropriate contact?
  • How do we route the request to the person?
  • Is this a technical question? Is this a policy question? (BOTH)
  • We’ve been doing this for a while; what do we need that isn’t working?
  • Tech Partners hasn’t been sufficient; it has been a manifestation of some of our problems.
  • The question of who should get this info rapidly. Right now, by the time someone sends something out, it’s very late.
  • Should be a mailing list - as a starting point. And we should have more specific contacts.
  • Network contact information is already available - why isn’t this being leveraged? Many person-to-person contacts, not using different groups’ ticketing systems (which would be most efficient).
  • It comes down to trust - need to have frank discussions to arrive at good solutions.
  • Security communication issues are the same problem as for campus IT as a whole - the same bi-directional communication does not exist in a robust state.
  • Many contacts are not really viewing their jobs as "security", when it is actually a responsibility of the person who is required to patch a system.
  • Professors and researchers have a different view on risk. They are very flexible in dealing with it, and have a very rational economic reaction: they will only spend time on mitigating something if it really will save them time or have a large risk in the future.
  • Boundaries have gotten fuzzier over time - ownership of wired networks was well defined. Wireless access doesn’t put them in the same buckets.
  • Laptops are not always managed, which changes the ability of central security to "know" who to contact.
  • If this is a bi-directional communication goal:
    • Need: Cross-referenced directory of security contacts. Could start on one side (ex: compromised host), and map to someone who could help.
    • Need: Every group must have a designated security contact
    • There are many strategies to achieve the goal of cross-referencing security contacts: AANTS, a Manifest group of "people I care about", etc. (a rough sketch of such a directory follows after this list)
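
A rough sketch of the cross-referenced security-contact directory idea, using simple in-memory mappings; every department, host, NetID, and address below is a made-up example, and a real implementation would presumably sit on top of AANTS, Manifest, or a campus directory service:

# Sketch of a cross-referenced security-contact directory: start from any
# identifier we happen to have (a hostname or a NetID) and map back to the
# owning unit's designated security contact. All entries are fabricated.

CONTACTS = {
    "chem": {"email": "chem-security@lists.wisc.edu", "phone": "555-0100"},
    "lss": {"email": "lss-security@lists.wisc.edu", "phone": "555-0101"},
}

# Each index maps a different kind of identifier to the owning unit.
HOST_INDEX = {"crystal-lab-01.chem.wisc.edu": "chem"}
NETID_INDEX = {"bbadger": "lss"}

def find_security_contact(identifier):
    """Resolve a hostname or NetID to its unit's designated security contact."""
    unit = HOST_INDEX.get(identifier) or NETID_INDEX.get(identifier)
    return CONTACTS.get(unit) if unit else None

if __name__ == "__main__":
    # Example: a compromised host is reported; look up who to notify.
    print(find_security_contact("crystal-lab-01.chem.wisc.edu"))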

To summarize:

  1. There is real-time information about existing issues (active exploits or confirmed vulnerabilities), and
  2. There is info about newly discovered issues that may cause issues in the future, but...
    • Ex: We send off info about compromised NetIDs and never hear back whether something was done, what was done, etc. Info is also not currently bi-directional - neither from distributed to central nor from central to distributed.
    • Users get the info about a compromised NetID, but departments only get information IF the user mentions it to them.
    • Existing known issue between central services: when a service is compromised, notification to other central services is difficult and non-automated; it also must be broadened out to other distributed systems.

Idea: Could use an API or pub/sub mechanism (a rough sketch follows below) to get info about:

  1. Find a security contact for a Person or Computer
  2. Notification to service providers across accounts. Shibboleth helps get us very far, but doesn’t cover everything.
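
A hedged sketch of the pub/sub idea in plain Python, with an in-process event bus standing in for whatever messaging service or API would actually be used; the topic names and payload fields are assumptions for illustration, not an existing campus API:

# Sketch of the pub/sub idea: central security publishes events (e.g. a
# compromised NetID) to a topic, and any subscribed unit - central or
# distributed - receives them and can publish follow-ups, which makes the
# flow bi-directional. Topic names and payload fields are illustrative.
from collections import defaultdict

class SecurityEventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callback for a topic, e.g. 'compromised-netid'."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        """Deliver an event dict to every subscriber of the topic."""
        for handler in self._subscribers[topic]:
            handler(event)

if __name__ == "__main__":
    bus = SecurityEventBus()

    # A distributed unit subscribes to compromise notices...
    bus.subscribe("compromised-netid",
                  lambda e: print(f"[dept] investigating {e['netid']}"))
    # ...and central security subscribes to remediation reports coming back.
    bus.subscribe("remediation-report",
                  lambda e: print(f"[central] closed: {e['summary']}"))

    # Central publishes a notice; the department later reports back.
    bus.publish("compromised-netid", {"netid": "bbadger", "source": "CSOC"})
    bus.publish("remediation-report",
                {"netid": "bbadger", "summary": "password reset, host reimaged"})

In practice the in-process bus would be replaced by an actual broker or an HTTP API with subscriptions, but the shape is the same: named topics, structured events, and subscribers on both the central and distributed sides.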

Some current gaps:

  • Not bi-directional - actually many-to-many.
  • Currently a lack of transparency with notification information. We need more than "something fishy is going on" with no logs or info about what they thought they saw.
  • Need a structure for ALL IT professionals to partake in this process. Currently we would easily miss folks who are responsible if they are not actively engaged in the IT Security community.
  • Needs to include part-time security professionals, and student positions that have high turnover.

Recommendations:

  1. Each part of campus has a point of contact, who can at least direct this information, OR is the person who can take action
  2. A broadcast notification mechanism for known issues in real time
    • Information should not be filtered by DoIT communications
    • To encourage bi-directional communication, both parties need to see the value in participating. If an event looks like I just need to "comply", I will provide a superficial level of information to meet that compliance. Currently there is not a high level of value received for providing a high level of information back up to central.
    • Compliance-level communication is not one of our desired destinations; we want additional information beyond that.
  3. Deal with pro-active information about vulnerabilities that might not be exploited yet. How do we make it easy for the non-good citizens as well?
    • DoIT is doing a certain level of network watching - what level of things is patched? What is actually being looked at? There is a lack of transparency about exactly what is being watched (ex: what does border watching mean?), and we need a way to tell so everyone isn’t reinventing the wheel.
    • No security tool catalog. We’ve got some tools, but don’t know where all of them are.
    • Out of scope, somewhat: how do new IT professionals in a small department get hooked into IT Security Community on campus?
    • Recent notices about a current issue - info about the issue plus reputable links on how to remediate, etc. - have been good.
    • For bi-directional communication to work, both parties should believe they are receiving value from participating. History has been more one-directional, unfocused, and not valuable to all parties, and this has eroded trust.
    • How to get people to have less work to do, not more.

Key Points:

  1. Lots of conversation about the compromised host or credential
  2. Pro-active management of vulnerabilities that may not be exploited yet.
  3. Communication - for bi-directional communication to work, both parties should believe they are receiving value from participating. History has been more one-directional, unfocused, and not valuable to all parties, and this has eroded trust.
  4. Cross-Referenced directory of security contacts.