International Symposium on Distributed Computing
The International Symposium on DIStributed Computing (DISC) is an annual academic conference for refereed presentations whose focus is the theory, design, analysis, implementation, and application of distributed systems and networks. The symposium is organized in association with the European Association for Theoretical Computer Science (EATCS). The 22nd symposium was held in Arcachon, France, on September 22-24, 2008.
The symposium dates back to 1985, when it began as a biennial Workshop on Distributed Algorithms on Graphs (WDAG); it became annual in 1989. The name changed to the present one in 1998.
External links
* [http://www.disc-conference.org/ DISC website]