DIY Electronic projects

Full Version: Sve i svašta ("Anything and everything")
Yes, it's Bane, yes! Smile
Well, I could tell he looked familiar!

I was looking at the photos from the meetup in KG (which unfortunately I didn't make it to), and today I was browsing the VISER school website and saw the graduates. I was checking whether anyone looked familiar, and then I thought, this 'kid' looks familiar! I zoomed in on the photo and said, that's Bane!
Hahah, yes, it's me Big Grin Complete with the enrollment photo from 5 years ago Big Grin
Just for TDA Smile
[attachment=8574]
Hahaha!
Well, where did you find this? Big Grin
I have PCBs too Smile
[Image: sramtebilo.jpg]
This isn't a joke, but I have nowhere else to squeeze in a question like this.

Does anyone have the full version of "Circuit Maker 2000"? Something went seriously wrong with that program on my machine and there is no way I can revive it. I saved all my designs, but I can't open them, because now I only have the free STUDENT version, and it refuses to work with files from the next (newer) generation and with files that have more than 50 components.
DONE! "Samuki" jumped in to help and the program is installed again.

THANK YOU Samuki!!!
Bravo Samuki, bravo!
We were quick Smile
Almost like "With a Little Help from My Friends", only this time it was a BIG HELP!
A small announcement.
Members Dragan100, apexaudio and Braca have just been granted VIP status for their merits and contributions to this forum. Smile
Congratulations!
Zjeee Joining in the congratulations Zjeee
Likewise... BRAVO!
Deserved and then received, nothing is handed out in advance! Smile Congratulations, friends, and don't let it go to your heads now! Smile
What numbers, enough to make your head spin ~~~

http://wiki.apache.org/hadoop/PoweredBy

eBay
532-node cluster (8 × 532 cores, 5.3 PB).
Heavy usage of Java MapReduce, Apache Pig, Apache Hive, Apache HBase.
Used for search optimization and research.

Facebook
We use Apache Hadoop to store copies of internal log and dimension data sources and use it as a source for reporting/analytics and machine learning.
Currently we have 2 major clusters:
A 1100-machine cluster with 8800 cores and about 12 PB raw storage.
A 300-machine cluster with 2400 cores and about 3 PB raw storage.
Each (commodity) node has 8 cores and 12 TB of storage.
We are heavy users of both streaming and the Java APIs. We have built a higher-level data warehousing framework on top of these features, called Hive (see http://hadoop.apache.org/hive/). We have also developed a FUSE implementation over HDFS. (A short streaming example is sketched after the Yahoo! entry below.)

Yahoo!
More than 100,000 CPUs in >40,000 computers running Hadoop
Our biggest cluster: 4500 nodes (2×4-CPU boxes with 4×1 TB disks & 16 GB RAM)
Used to support research for Ad Systems and Web Search
Also used to do scaling tests to support development of Apache Hadoop on larger clusters
Our Blog - Learn more about how we use Apache Hadoop.
>60% of Hadoop Jobs within Yahoo are Apache Pig jobs.
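
For anyone wondering what "streaming" in the Facebook entry means: Hadoop Streaming lets the map and reduce steps be ordinary scripts that read standard input and write standard output, with no Java required. Below is a minimal word-count sketch in Python; the file names and the input/output paths are placeholders for illustration, and the exact location of the streaming jar depends on the Hadoop installation.

mapper.py:

import sys

# Emit "word<TAB>1" for every word on standard input;
# Hadoop feeds the input split to this script line by line.
for line in sys.stdin:
    for word in line.split():
        print("%s\t1" % word)

reducer.py:

import sys

# Hadoop sorts the mapper output by key before the reduce phase,
# so all the counts for a given word arrive consecutively.
current, count = None, 0
for line in sys.stdin:
    word, value = line.rstrip("\n").split("\t", 1)
    if word != current:
        if current is not None:
            print("%s\t%d" % (current, count))
        current, count = word, 0
    count += int(value)
if current is not None:
    print("%s\t%d" % (current, count))

A job like this is submitted along these lines (jar name and paths vary per version):

hadoop jar hadoop-streaming.jar -input /some/input -output /some/output -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py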
Well, what can you do, they're scattered all over the world...