Proposal Template for
CloudFlow Open Call 1 (CloudFlow-1)
Application Experiment title
Partner table
| Partner no | Partner acronym | Partner name | PIC (Participant Identification Code) | Organization type | SME | SME status confirmed by EC |
| 1 | Partner 1 | | | End user | Yes/No | Yes/No |
| 2 | Partner 2 | | | | | |
| … | … | | | | | |
| N | Partner n | | | | | |
Coordinator
Name:
Affiliation:
Phone:
E-mail:
Abstract (5 lines)
Keywords (max. 5 keywords)
Keyword 1, …, keyword 5
Table of Contents
1 Industrial relevance (max. 1 page)
1.1 Description of the current process (engineering and/or manufacturing)
1.2 Description of the envisaged process in a Cloud environment
2 Experiment design (max. 1 page)
3 Technical impact (Sections 3 and 4 together max. 1 page)
4 Business impact
5 Innovation and complementarity (max. 1 page)
6 Exploitation (max. 1 page)
7 Technical approach (max. 2 pages)
8 Work plan (max. 1 page)
8.1 Activities
8.2 Milestones
8.3 Deliverables
9 Resources committed (max. 1 page)
10 Consortium (max. 2 pages, max. ½ page per partner)
Annex 1: Individual Assessment Report Form
Call information
Identifier: CloudFlow-1
Call title: New application experiments for CloudFlow – 1st call
Project full name: Computational Cloud Services and Workflows for Agile Engineering
Project acronym: CloudFlow
Grant agreement number: 609100
Call deadline: 30 September 2014, 17:00 (Brussels local time)
1 Industrial relevance (max. 1 page)
1.1 Description of the current process (engineering and/or manufacturing)
Which task(s) is/are addressed? How do you solve it/them today?
What are the current limitations and their consequences?
Which types of data and formats are involved? Which tools do you use?
Which computational resources are you using? (From desktop PC to HPC infrastructure)
How much human effort and computing time does it require?
1.2 Description of the envisaged process in a Cloud environment
What are the suggested improvements and potential benefits?
2 Experiment design (max. 1 page)
What are the driving questions for the experiment? What do you want to know/prove?
How will the experiment provide evidence and answers to the driving questions?
What are your performance indicators?
How will you measure them?
Please answer and discuss the above questions for business models AND technical aspects.
3 Technical impact (Sections 3 and 4 together max. 1 page)
| End users: | What is the technical impact of the experiment as a whole on your application? |
| ISVs: | What is the technical impact of the experiment as a whole on your software? |
| HPC center: | What is the technical impact of the experiment as a whole on your infrastructure? |
| R&D partner: | What is the technical impact of the experiment as a whole on your technology? |
What do you think is the technical impact of your experiment on the CloudFlow infrastructure?
4 Business impact
| End users: | What is the impact of the experiment as a whole on your business? |
| ISVs: | What is the impact of the experiment as a whole on your business? |
| HPC center: | What is the impact of the experiment as a whole on your business? |
5 Innovation and complementarity (max. 1 page)
What are the innovative aspects of your application experiment, and how does it complement the existing ones?
Some examples of innovative aspects:
- enable end users to access computational Cloud engineering services not yet used by them
- allow simulations of more complex models for developing better products / for improved reliability assessment and compliance with requirements or better predictability of product behavior ("design for X" and simulation/optimization)
- enable/support complex computational engineering services and workflows in the Cloud, enhancing the interoperability of data and tools
- others
Please explain and discuss.
6 Exploitation (max. 1 page)
How are the results going to be exploited during and beyond the lifespan of the experiment?
How are you going to continue the partnership of this experiment after the end of the project?
How do you plan to scale up after the end of the experiment to include other users, countries, etc.?
How will the experiment results (incl. software) be made available to CloudFlow after the end of your experiment?
7 Technical approach (max. 2 pages)
Explain how you want to implement and run the experiment.
- What are the building blocks of your solution?
- Which changes do you plan to make to the building blocks, e.g. modularization and cloudification of your software?
- How are the building blocks related, and how will they be integrated?
Please describe the technical approach, the corresponding activities, and the partners involved, referring to the tables in Section 8.
8 Work plan (max. 1 page)
Please fill in the tables for the activities described in Section 7, introducing milestones and deliverables.
8.1 Activities
| Activity No | Activity name | Lead participant no. | Person-months | Start month | End month |
| | | | <#> | <#> | <#> |
| TOTAL | | | | | |
NB: There is a mandatory activity to evaluate the experiment. The corresponding work will be carried out by the CloudFlow Competence Center.
8.2 Milestones
| Milestone No. | Milestone name | Activity(-ies) involved | Expected month | Comment |
| | | | <#> | |
8.3 Deliverables
| Del. No. | Deliverable name | Activity(-ies) involved | Nature | Dissemination level | Delivery month |
| | | | R/P/D/other | PU/PP/RE/CO | <#> |
Nature: R = report, P = prototype, D = demonstrator.
Dissemination level: PU = public; PP = restricted to other programme participants (including Commission Services); RE = restricted to a group specified by the consortium (including Commission Services); CO = confidential, only for members of the consortium (including Commission Services).
NB: In addition to the deliverables listed above, there are obligations to CloudFlow, including activity reporting (every three months), a final experiment report and project review contributions.
9 Resources committed (max. 1 page)
| Participant number | Participant short name | Estimated eligible costs | | | | | | Requested EC contribution (€) |
| | | Effort (PM) | Personnel costs (€) | Subcontracting (€) | Other direct costs (€) | Indirect costs (€) | Total costs (€) | |
| 1 (Lead) | | <#> | <#> | <#> | <#> | <#> | <#> | <#> |
| 2 | | | | | | | | |
| 3 | | | | | | | | |
| … | | | | | | | | |
| Total | | | | | | | | |
Please fill in the table. Explain clearly and justify your (types of) costs (other direct, subcontracting, etc.), e.g.:
- use of HPC resources
- software
- etc.
Note that additional costs for the CloudFlow Competence Center do not have to be considered here. There is independent financing for existing CloudFlow partners of up to 40% (estimated, on average) of the total EC contribution requested by an experiment (maximum 110 000 €). In sum, the total EC contribution per experiment could add up to 110 000 € + 44 000 €. If the proposal is selected for funding, the CloudFlow Competence Center will specify in detail the budget distribution among the existing partners.
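To make the arithmetic above concrete, here is a minimal Python sketch for the maximum case (purely illustrative; only the 40% rate and the 110 000 € cap come from the call text, the variable names are our own):

    # Illustrative funding arithmetic for the maximum case described above.
    requested_ec_contribution = 110_000  # maximum EC contribution requested by an experiment (EUR)
    competence_center_rate = 0.40        # estimated average financing rate for existing CloudFlow partners
    competence_center_funding = requested_ec_contribution * competence_center_rate
    total_ec_contribution = requested_ec_contribution + competence_center_funding
    print(f"Competence Center funding: {competence_center_funding:,.0f} EUR")  # 44,000 EUR
    print(f"Total EC contribution: {total_ec_contribution:,.0f} EUR")          # 154,000 EUR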
10 Consortium (max. 2 pages, max. ½ page per partner)
Please describe the consortium as a whole.
Please describe what each partner brings to the experiment and to the CloudFlow project.
Please provide a company profile and key personnel for each partner (existing CloudFlow partners need not be listed):
| Partner name | |
| Link to webpage | |
| 1-2 key person(s) | Max. 5 lines |
| Prior participation in EC projects | Max. 5 lines |
Annex 1: Individual Assessment Report Form
| 1. Industrial relevance | Score (0, 1, 2, 3, 4 or 5) |
| Justification of score: | |
| 2. Experiment design | Score (0, 1, 2, 3, 4 or 5) |
| Justification of score: | |
| 3. Technical impact | Score (0, 1, 2, 3, 4 or 5) |
| Justification of score: | |
| 4. Business impact | Score (0, 1, 2, 3, 4 or 5) |
| Justification of score: | |
| 5. Innovation and complementarity | Score (0, 1, 2, 3, 4 or 5) |
| Justification of score: | |
| 6. Exploitation | Score (0, 1, 2, 3, 4 or 5) |
| Justification of score: | |
| 7. Soundness of technical approach | Score (0, 1, 2, 3, 4 or 5) |
| Justification of score: | |
| 8. Quality of work plan | Score (0, 1, 2, 3, 4 or 5) |
| Justification of score: | |
| 9. Effective and justified deployment of resources | Score (0, 1, 2, 3, 4 or 5) |
| Justification of score: | |
| 10. Appropriateness of the consortium for the experiment | Score (0, 1, 2, 3, 4 or 5) |
| Justification of score: | |
Comments / instructions:
| 0 | The proposal fails to address the criterion under examination or cannot be judged due to missing or incomplete information. |
| 1 | Very Poor - The criterion is addressed in an inadequate manner, or there are serious inherent weaknesses. |
| 2 | Poor - While the proposal broadly addresses the criterion, there are significant weaknesses. |
| 3 | Acceptable - The proposal addresses the criterion, although significant improvements are possible. |
| 4 | Good - The proposal addresses the criterion well, although certain improvements are still possible. |
| 5 | Very Good - The proposal successfully addresses all relevant aspects of the criterion in question. Any shortcomings are minor. |
Only integer scores are possible.
Rules for acceptance
- The threshold for the total score is 30 out of 50.
- At most 3 categories may be strictly below 3 points (the sketch after this list illustrates how this rule combines with the threshold above).
- If two or more proposals reach the same numerical score and lie at the cut-off for acceptance imposed by the available funding resources, the CloudFlow Competence Center will decide which one(s) to accept. This decision then has to be justified towards the European Commission, represented by the Scientific Officer of the CloudFlow project.
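The following is a minimal Python sketch of how the two numerical rules above combine (illustrative only; the function name and the example scores are hypothetical, while the thresholds come from the rules above). Ties at the funding cut-off are still resolved by the CloudFlow Competence Center as described in the last rule.

    def passes_numerical_rules(scores):
        # scores: the ten integer criterion scores (0-5) from the
        # Individual Assessment Report Form.
        assert len(scores) == 10 and all(s in range(6) for s in scores)
        meets_threshold = sum(scores) >= 30               # total score threshold: 30 out of 50
        low_categories = sum(1 for s in scores if s < 3)  # categories strictly below 3 points
        return meets_threshold and low_categories <= 3    # at most 3 such categories

    # Hypothetical example: total score 33 with two categories below 3 passes both rules.
    print(passes_numerical_rules([4, 3, 2, 5, 3, 2, 4, 3, 4, 3]))  # True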