EAB meeting, Philadelphia, 1 Nov 2005. Proposed CS curricula update: adding Reconfigurable Computing. Reiner Hartenstein, TU Kaiserslautern.
Reconfigurable Computing. Embedded Systems. Computing Curricula 2004.
Flowware
Paradigm Shifts: Nick Tredennick's view (3)
The Morphware Age
Software / Configware Co-Compiler
Design Starts until 2010: from 80,000 to 110,000 [Dataquest]
fastest-growing segment of the semiconductor market: $4 billion [Dataquest]
FPGAs reduce power dissipation: MOPS/mW improved by a factor of 10
Power and air conditioning: cutting the electricity bill by up to millions of dollars per year
FPGAs*) for Reconfigurable Computing (RC)
*) Field-Programmable Gate Array
compared to microprocessors (Intel, ...): speed-ups by factors of 100 and more
Exponential Growth & Strategic Dimension
The strategic dimension has been appreciated.
Reconfigurable Computing (RC) became mainstream years ago, not only in Embedded Systems
Education is an essential factor in solving the current complexity crisis and creating a qualified workforce
Our students are not even aware that we now live in the Morphware Age, not the Mainframe Age
Changing this will make CS much more fascinating
10/24/05; Vol. 24 No. 31 --- Ask the Professor: Reconfigurable Computing - By Joab Jackson, GCN Staff

The computer science academic community has investigated the use of field-programmable gate arrays for quite some time. To get beyond the product hype, we interviewed associate professor Kris Gaj of George Mason University's Department of Electrical and Computer Engineering, who has long been involved in reconfigurable computing.

GCN: We've heard claims of anything from a 40- to 20,000-fold increase in performance speeds over standard commodity chips. What kind of improvement can users expect from a well-engineered program?

Gaj: Our group has developed multiple applications for a few reconfigurable computers, from SRC, SGI and Cray. We have seen speed-ups compared to a single traditional microprocessor (Pentium 4) anywhere from none to over 1,000 [times]. The speed-up really depends on a particular task, and how well this task can be divided into smaller operations that can execute in parallel. [The claim of a] 20,000-times speed-up is probably an exaggeration, unless you use a lot of FPGAs, but such machines would really cost a fortune.

GCN: Where is that performance improvement coming from?

Gaj: A microprocessor executes instructions sequentially, one by one. A single instruction does only a small part of the job, so it takes a long time to complete the entire sequence of such instructions constituting the program. Additionally, a microprocessor cannot be reconfigured, so a lot of resources may need to be allocated for functions that will never be used by a particular program. An FPGA may execute multiple operations in parallel. Since it is reconfigurable, you do not need to waste any resources, such as circuit area, for implementing operations that are not used by a given program. The contents of an FPGA may also change on the fly, i.e., during the program execution, so you do not need to have all resources tied up at the beginning of computations.
GCN: Do you predict companies like Cray and SGI can bring FPGA computing to a broader audience of users?

Gaj: I would not expect an FPGA in every PC at home anytime soon. For a couple of years, the primary use of reconfigurable computers will be for scientific computations, such as weather simulations, space exploration, the human genome project and simulation of nuclear reactions. These machines should be treated as an alternative to traditional supercomputers, and may eventually outperform and replace some or most of them. For bringing FPGAs to a broader audience, the prices must drop by at least an order of magnitude, and tools must be developed that make the programming of these machines much easier than it is right now. Additionally, in many cases, traditional microprocessors would be completely sufficient for the majority of personal and business applications.
almost 90% of all software is implemented for embedded systems
embedded software doubles every 10 months
FPGAs are indispensable for embedded systems
today, typical CS graduates are not qualified for this job market
hardware / configware / software partitioning problems cannot be handled
… the de facto basic model is a dual-paradigm system, not von-Neumann-only
… the flourishing configware industry is the younger brother of the software industry
fragmentation into many application areas, each teaching its own tricks – no common model
unstructured view of creators' architectures, advertised with catchy terms ("we are creative")
no clear hierarchical view of abstraction levels
no common terminology: managers may not understand what you are talking about
confusing mindset, no computing viewpoint: not seen as a common fundamental paradigm
teach freshmen the dual-paradigm model from the start
integrative undergraduate lab courses needed
teach code refactoring & algorithmic cleverness
CS is the only right point of view to fix all this
providing RC and embedded-system qualifications to our students through common models – not tricks
making CS more fascinating: innovation by RC
reversing our membership development trend?
CS is the only right place to provide all this
the technology is 20 years old, invented in 1984 (Xilinx)
software-to-configware migration: all enabling methodologies are available, some published in the 1970s or 1980s
deadline for submissions: November 27, 2005
Advanced Real Time Systems
Real-Time Systems (Sweden)
Recommendations for Designing New ICT Curricula
Chess - Center for Hybrid and Embedded Software Systems
(courses in embedded systems)
WESE - Workshop on Embedded Systems Education
other curriculum recommendations
in these recommendations RC is not an issue so far:
action needed by CS
We need to counter the current education trend toward specialization
We need curricula to cope with the clash of cultures by merging all different backgrounds in a systematic way
CS curricula for unifying the foundations
We need innovative lectures and lab courses integrating reconfigurable computing into progressive CS curricula.
CS curricula should adopt the dichotomy of software engineering and configware engineering
CS undergraduate curricula must switch from von-Neumann-only to the dual-paradigm model
Application domains' points of view cannot replace the urgently needed CS-based efforts.
Only CS is qualified to be the conductor of RC-related curriculum recommendations and their implementation