lifecycle document draft #254
Conversation
and some explanatory text
clarifications and labels
nit: governance/AssessmentLifecycle.png => please make filenames all lower case with dashes to separate words
as requested in PR comments
and missing header
@ultrasaurus done!
I'm a bit confused about the CVE discussion here. Shouldn't all projects monitor CVEs that are in their codebase or in libraries they use? Doesn't the impact of a CVE vary so widely that it is hard to generalize whether it is okay to ignore / defer them?
I haven't had the chance to give this a thorough read, but think a document like this will be helpful. Some quick notes:
- It would be helpful to situate this within the repo. Where would it be linked from?
- Is the CVE review for the SIG or the project? The SIG doesn't have the bandwidth to commit to reviewing CVEs in a timely manner -- we're still trying to get these assessments to happen quickly!
- The diagram really helps highlight the content! (side note: I wonder if we could use something like mermaid <https://mermaidjs.github.io/#/> to have text representations of diagrams)
- The idea of biennial assessments is new to me. I remember discussions about annual updates to the assessment and would like to try that (since I think many projects will have few changes annually, and I would like to see if that happens). In the text, it says "biennial or annual." While we're waiting on formal guidance from the TOC, maybe the diagram could use a different word like "Assessment Update."
- Please review the writing style guide <https://github.com/cncf/sig-security/blob/master/CONTRIBUTING.md#writing-style> -- we're trying to keep the repo consistent (noticed long lines, headline caps).

Thank you!
I thought annual might be overkill for non-security or non-control-plane projects. Biennial is every other year (i.e. once per 2 years), but if we wish to simplify and make it annual for everyone, I have no objection. That said, in terms of volunteers I was hoping to keep the load lower: if we try to schedule every project every year, we would need a huge complement of dedicated volunteers. That's why I thought once every 2 years made more sense for lower-risk projects.

As to CVE (or other discrete event) reviews, I hadn't yet suggested who would do these or how this was to be done... for all the good and thorny reasons Justin touches on :) This might be a nice role for some volunteers who may not be able to commit to a huge chunk of assessment work but perhaps would be willing to monitor projects for high-impact events... not just CVEs; for example, I could imagine more mundane but significant events also triggering at least a discussion if not a re-assessment, e.g. a large version release of the project or of the Kubernetes control plane, an underlying specification revision, a redesign of the logic, etc.

Will review and fix the other style and format issues Sarah notes. I'm fine trying mermaid, though I've not seen it before. For future revisions I'll give it a whirl!
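For example, here is a rough mermaid sketch of the lifecycle as I currently picture it (the stage names, cadences, and trigger events below are just my working assumptions, not final wording from the draft):

```mermaid
flowchart TD
    %% Illustrative sketch only: cadences and triggers are assumptions, not agreed policy
    A[Initial assessment] --> B{Project risk level}
    B -->|security or control plane| C[Annual assessment update]
    B -->|lower risk| D[Biennial assessment update]
    C --> E{High-impact event?}
    D --> E
    E -->|yes, e.g. CVE or major release| F[Targeted re-assessment]
    E -->|no| B
    F --> B
```

Something like that could live next to the PNG (or eventually replace it) so that diagram changes show up in diffs.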
I like the overall document, but I feel the focus on CVEs is a bit odd. Shouldn't projects be reviewing CVEs as they come out? Shouldn't we be quick to respond to critical issues?
I can remove the mention of CVEs - it was merely a placeholder or shorthand for "some discrete event of varying level of importance"
Yes, for CVEs they should. But does that mean they will? I guess what is in scope in my mind is a review of how good a job the project is in fact doing at responding to CVEs and other events.
@rficcaglia Any continued work on this? Should I add the inactive tag for this in the meantime?
Well, I think it's teed up for a decision to be made:
* do we want recurring reviews after the initial assessment?
* if so, do we want them all to be at the same cadence regardless of other factors?
* if not, what factors do we want to include: frequency of CVEs? whether it is a security component or not? project team size? SIG assessment backlog? other factors or trigger events?
* make a decision on a specific periodicity based on the above

I can codify those decisions into a revised doc. Does anyone recommend a voting tool for these sorts of consensus-building exercises, or should folks just add GitHub comments here?
I think having it in the doc will be good, and we can set aside some time on one of the SIG meeting agendas to have additional discussion beyond what is already on the PR.
I agree with @rficcaglia that making assessments annual for every project is currently too big a requirement for our group. How often we do periodic reviews will become a really important question if at some point the TOC decides that the security assessments are a necessary part of applying for Incubation, Graduation, etc. Another thing we should keep in mind is that it's possible for a project to be initially categorized at one risk level and later change. I am also wondering whether we should consider third-party audits in the scheduling of the security assessments.
This issue has been automatically marked as inactive because it has not had recent activity.
@rficcaglia @lumjjb what's the latest on this?
I was thinking an annual review would not be a full assessment (unless features that affect security have been changed significantly or added). I attempted to describe that here: https://github.com/cncf/sig-security/blob/master/assessments/intake-process.md#updates-and-renewal
@MVrachev belatedly answering your question about third-party audits, see https://github.com/cncf/sig-security/blob/master/assessments/intake-process.md#intake-priorities; with regard to CNCF audits, "For future audits, the security assessment will be a pre-condition to the audit." I don't know if it is mentioned anywhere in the process, but I would certainly expect that if a project has funded its own third-party audit, it would be referenced in its self-assessment.
I think it would be good to go through this and open it up for discussion at one of the weekly meetings. For this PR, I do think that the scope of this document should be limited to the ideology of assessment cycles, leaving out the instructions for assessment cycle decisions... and those should be deferred to the intake process document, as pointed out by @ultrasaurus.
It can be a separate effort to create some definition of how levels are assessed: something which is more quantitative or, if qualitative, a process to determine risk, i.e. review by the assessment team, approved by the assessment facilitator / co-chairs.
* Knowledge Transfers (Assessor KT; project team KT; community KT)

# References
What is the nature of the references to what is written? Could there be some indicators / inline citations for the references?
* Review Scope
* Update Process
* Notifications (e.g. invalidating an assessment due to new attack type or critical supply chain vulns)
* Knowledge Transfers (Assessor KT; project team KT; community KT)
nit: Specify (KT) as abbreviation before use
## Levels

In considering how frequently to reassess, 3 levels of risk should be defined:
suggestion: In considering the frequency of reassessment
![Assessment Lifecycle](assessment-lifecycle.png "Assessment Lifecycle")

## Initial Assessment
All projects should go through the same initial assessment so we have a consistent baseline. If 3rd party code audits or pen tests or vulnerability scans have been performed previously, these should be used as inputs into a complete security assessment. |
Add a link to the assessment guide for the initial assessment
It should be possible to refresh the assessment incrementally unless the underlying project design or architecture changes radically.
The scope of periodic updates should be limited in cases where there are low-risk changes, or greatly expanded when security problems are identified by the community (e.g. a high-impact CVE).

From the perspective of the (volunteer) assessment team, performing a complete reassessment would likely be onerous, especially if there is a queue of new project assessments and limited assessor bandwidth.
nit: Maybe more generically, from the perspective of SIG Security, since we have not officially defined an "assessment team"
I'd like to bring this into a working meeting. I see two major things here, and this has been a topic needing to be added to the roadmap (@pragashj and I briefly discussed this):
* #415: from our discussion in the meeting this past Wednesday, perhaps we need to rename what we are calling assessments to be more clear. I know there are lots of opinions on this, and the related #394 brought this up as well. (IMHO we should call these "Evaluations".)
* the renewal of assessments/evals. We don't want these to get stale or stagnant. In the past we discussed encouraging projects to collect a listing of their security features which, if changed, affect its ability to be secure (things like access control changes, methods, availability, integrity mechanisms). It would allow teams to potentially add additional scrutiny on changes to those security features, as well as give the SIG an actionable trigger to renew the previous evaluation.
+1 on a working meeting.

Some more color/context: there were many discussions on what an assessment was, is, or should be, and what we now have, X, is very different both in intent and practice from the thinking that went into this draft.

An important contextual note is that at the time it was perceived by some, perhaps incorrectly, that the assessment was a requirement for CNCF projects before sandbox and certainly before maturation. This has been clarified since.

Had it been a hard requirement, I'm speculating that there might have been more resources made available to create a community-led "in depth" security assessment (let's call that Y) which could help projects meet this Y requirement, including grants or "summer of security" type things.

If you go back to the Zoom recordings, this was a subject of many calls. I'm still a fan of Y, but I acknowledge my sage peers who correctly identified that a volunteer-only effort would necessarily require a much more scaled-back approach, and many of their concerns have been validated by the past year.

I think X has value; I still think Y has different (orthogonal) value, and others have expressed similar sentiments on recent calls. Both should be refreshed periodically IMHO, but X requires a lot less time and effort from volunteers than Y. Also, projects need to clearly understand the benefits of X (or Y) so they can prioritize effort relative to other project needs. As a concrete example, CloudCustodian has spent hundreds of effort hours preparing the self-assessment doc for their X. That is time that could have been spent on features or bug fixing, so we should respect that investment and make sure X (or Y) has very clear, and substantial, benefits for the first cycle and for refreshing.
I like this overall. I do wonder if the design churn of a project should factor into this. Some projects change significantly year-over-year, while others are quite static...
This issue has been automatically marked as inactive because it has not had recent activity.
Let's revitalize this. @rficcaglia would you add it to an upcoming meeting? I'd like to close this out; it is our longest outstanding PR and is more important as more projects seek graduation.
@JustinCappos what is the expectation at the TOC level? Will there be resources made available and/or a community-driven mandate to reassess every 1-2 years (as there is, for example, for Kubernetes)? Will projects be required to do so regardless of resources?

I think if there are a) resources to encourage volunteers and projects to participate and b) a TOC authorization/mandate, so that volunteers don't feel like the "bad cops" and instead are perceived as helpful to the projects achieving the required steps for the TOC, this will go better. Just my .02 BTC.

Also, "resources" doesn't mean only 3rd-party paid audits: it could be training on review skills, funds for or comp'd certification exams, recognition in blogs or marketing, badges or personal recognition of participating volunteers and projects, recognition at conferences with presentation speaker slots, etc. We can be very creative on the resourcing to align with participation incentives; it doesn't just have to be piles of cash.
My sense is that @justincormack may be better positioned to respond to this. Your second point is a good one though. Community buy-in is hard. We need to show value in proportion with the effort expended.
This issue has been automatically marked as inactive because it has not had recent activity.
@rficcaglia I'm looking at this PR for the first time. It has been a while since it was filed. Is it still something you want to revisit or contribute to?
I think we can leverage the K8s model and set the high bar at every other year. 3 years was too long between K8s audits, but every year is just way too optimistic given the logistics and cost.

Maybe if you are not a big, mature (i.e. graduated) project it could be more lenient, every 3 years. Anything beyond that and one has to wonder why anyone concerned with security would use something that has gone more than 3 years without a security review. And if they are willing to look the other way, then they clearly don't worry too much about security.

Returning to my earlier comment, LF/CNCF paid for both the 2019 and 2022 K8s assessments. Even so, we scoped the most recent one way down and said no to a lot of scope we would have loved to add, but the budget did not allow for it. Those pieces still need to be assessed by the volunteer community without funding (every 2-3 years). I mean, how secure is K8s really if your CSI, CNI, CCM, etc. has vulnerabilities? Oh, and Admission Control plug-ins were not in scope… thinking of the recent CVEs.

So in summary, if we really want CNCF projects to be more secure, we need to reassess them at least every 3 years, ideally every 2 for graduated projects. CNCF needs to put some funds behind that goal *and* the community needs to volunteer and maintain templates and threat models *and* companies using these projects need to contribute staff time and/or funds to make that sustainable. It seems like a win-win-win to me, except for the bad guys :)

If the TOC is seriously considering supporting such a model, I'm happy to write up the recommendations and lessons learned from K8s and help present a framework with the TAG chairs to the TOC.
I see what you mean regarding a regular cadence to ensure reports are current, although it could be clearer whether you are referring to third-party audits like the one for Kubernetes versus assessments done by the TAG. For what it's worth, most projects that underwent security assessments had reviews between 2019 and 2021. Those projects have expanded in scope, deprecated and grown interfaces, and in some instances the repositories are now home to child sub-projects with separate code bases. Since not all projects share the same release cadence, number of major releases, or dynamic nature, assessing the delta in changes since the last assessment may be worthwhile before starting a new one. In the case of projects like Kubernetes, the differential is more obvious than in others.

If the ideal outcome is for the CNCF to prioritize and budget for more frequent third-party audits, that's one conversation that can be had, and one the foundation is receptive to, particularly regarding projects with a large end-user base; this is a recommendation that can be formulated, but there is not much to do on our part other than presenting the case. If the idea, on the other hand, is to reconvene the assessment teams from TAG-Security every so often, I think that between the initiatives of security pals and lightweight threat models, we can create feedback loops to gauge the "risk level" since a project was last fully assessed and have those discussions.

For now, can we move the security-assessment-lifecycle.md to a Google Doc for public comment from the rest of the community and make updates that capture the goal and reflect some of the other streams, like the two mentioned, that we now have in place? We plan to discuss this during an upcoming call. It would be good to get feedback from the assessment facilitators and make it part of a workgroup with the aim of keeping reviews fluid.
This issue has been automatically marked as inactive because it has not had recent activity.
Thanks again for putting together the presentation. The feedback was incorporated both into the streamlined assessment process put in place by Justin and into the security pals program.
Still WIP, but given recent CVEs I thought it important to keep momentum by PRing what I have so far. Definitely NOT complete; feedback welcome.
re: #152