
Ask HN: What are the most interesting emerging fields in computer science?

381 points | by Norther | almost 7 years ago

Hey HN,

What do you think is the most interesting emerging field in Computer Science? I'm interested in PhD areas, industry work, and movements in free software.

51 comments

dbatten, almost 7 years ago

Secure multi-party computation.

The basic idea is developing methods for two (or more) parties with sensitive data to be able to compute some function of their data without having to reveal the data to one another.

The classic example is developing an algorithm that allows two people to figure out who is paid more without either revealing what their salary is.

Such algorithms get significantly more complicated if the threat model changes from "we're all acting in good faith, but we just don't want to share this private info" to "I'm not sure some of the people involved in this are acting in good faith."

Based on my (admittedly limited) look into this field, it seems like there has been some theoretical progress made here, but there's nothing like a generalized framework or library for general development with it. Instead, practical applications seem to be one-offs. For example, a contractor a while back developed a system that lets parties (nation-states or private space firms) figure out if their satellites are going to run into each other without revealing anything about the location or orbit of their satellites. That way they don't share sensitive data, but they can move their satellites if they're on a collision course with somebody else.

Personally, I got interested in this when working for the government. I was working on an extremely cool data integration project (a State Longitudinal Data System grant from the US Department of Education) that basically went nowhere because we couldn't get over the legal hurdles to data sharing. If we didn't have to share data, but could still compute interesting statistics about the data, that would have been really cool.
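The share-and-combine idea behind MPC can be sketched for the simplest case, a joint sum: each party splits its input into random shares that individually reveal nothing, and only aggregates are ever published. (The salary *comparison* in the example above needs heavier machinery such as Yao's garbled circuits; this toy sketch, with a made-up `mpc_sum` helper, only shows the additive-sharing building block in the semi-honest model.)

```python
import random

P = 2**31 - 1  # public modulus; all arithmetic is mod P

def share(secret, n_parties):
    """Split a secret into n additive shares that sum to it mod P.
    Any subset of fewer than n shares is uniformly random."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def mpc_sum(secrets):
    """Each party shares its input with the others; each party adds the
    shares it holds locally, and only those partial sums are revealed."""
    n = len(secrets)
    all_shares = [share(s, n) for s in secrets]
    # party j holds one share of every input and combines them locally
    partial = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]
    return sum(partial) % P  # only the total is ever reconstructed

salaries = [70_000, 85_000, 62_000]
assert mpc_sum(salaries) == sum(salaries)
```

No party ever sees another party's salary, yet everyone learns the (correct) total.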
resiros, almost 7 years ago

I think the most interesting computer science fields are actually applications of CS in other domains.

Science has changed a lot in the last decades, moving from a genius in a room looking at the data and coming up with a grand theory, to vast amounts of data that no single human can make sense of. The work of the computer scientist is to quickly understand problems from various fields, then solve them using tailor-made algorithms that leverage the prior knowledge and the structure of the data.

One such interesting field (which I'm working on) is computational biology. We're working on leveraging sparse experimental data for protein structure prediction. To do that, we end up using algorithms and ideas from various CS fields, from machine learning to robotics to distributed systems. Other people are working on exciting areas like computational protein design, or studying drug-protein interactions in silico.
georgewsinger, almost 7 years ago

Almost all of the answers on this list are not fields that are "emerging" but fields that "have already emerged".

The ideal emerging field is one that's so obscure we haven't heard of it yet, but so important that we will. If there are widely disseminated books on Amazon about your field, it's not emerging. If there are hundreds of professionals cranking out papers about your field, it's also not emerging.

Emerging fields are underrated and under-recognized. What are they?
gota, almost 7 years ago

Process mining [1]. When I programmed a rather complex logistics simulator at work, I told my coworkers "whoever comes up with a way of instantiating a simulator from data will be praised forever". It turned out process discovery is a thing (well, one of _the_ things in PM). And there's so much cool stuff to do and being done. I'm now on the last stretch of my doctorate, researching the mining of typical plans in non-competitive environments.

[1] https://en.wikipedia.org/wiki/Process_mining
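The "instantiating a model from data" idea starts with process discovery, and its simplest building block is the directly-follows graph: how often does activity a immediately precede activity b in the recorded traces? A rough sketch (the `directly_follows` helper and the order-handling log are invented for illustration, not taken from any process-mining library):

```python
from collections import Counter

def directly_follows(log):
    """Mine a directly-follows graph from an event log: count how often
    activity a is immediately followed by activity b across all traces."""
    dfg = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# A toy event log: each trace is one case's sequence of activities.
log = [
    ["receive", "check", "pack", "ship"],
    ["receive", "check", "reject"],
    ["receive", "check", "pack", "ship"],
]
dfg = directly_follows(log)
assert dfg[("receive", "check")] == 3
assert dfg[("pack", "ship")] == 2
```

Discovery algorithms such as the alpha miner build on exactly these counts to propose a process model that could replay (or simulate) the log.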
meuk, almost 7 years ago

AI, machine learning, and neural networks are, of course, booming, but I consider them to be hyped.

I consider type theory and formal verification to be more promising (but more academic). Distributed systems and everything having to do with parallel and/or high-performance systems are a good midway between what the industry likes and what's interesting from an academic point of view.
stared, almost 7 years ago

Various fields of deep learning. Right now: reinforcement learning.

See https://www.forbes.com/sites/louiscolumbus/2018/01/12/10-charts-that-will-change-your-perspective-on-artificial-intelligences-growth/#38090e534758 or, in general, any other marker like NIPS submissions or arXiv preprints on DL.

Of course the focus changes, and maybe in the next two years it will be on something other than RL. But even computer vision is still a very vibrant field, since its breakthrough in late 2012 (https://www.eff.org/ai/metrics). The majority of more traditional disciplines of CS had their breakthroughs a few decades ago.
GolDDranks, almost 7 years ago

Homomorphic encryption is a mind-blower. But I fear that we may never see it in its fullest glory. It's going to be computationally too expensive or too impractical for one reason or another. One can still hope.
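For a taste of what "computing on ciphertexts" means, textbook RSA is already *multiplicatively* homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. A toy demonstration with deliberately tiny, insecure parameters (fully homomorphic schemes support both addition and multiplication, which is what makes them so much harder):

```python
# Textbook RSA with toy parameters -- insecure, illustration only.
p, q = 61, 53
n = p * q                           # public modulus (3233)
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

m1, m2 = 7, 12
# Multiply the ciphertexts; the result decrypts to the product
# of the plaintexts, without the party doing the multiplication
# ever learning m1 or m2.
assert dec(enc(m1) * enc(m2) % n) == (m1 * m2) % n
```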
kmisiunas, almost 7 years ago

DNA computing might be an interesting new domain [1]. The idea is to use DNA as memory, while using proteins/RNA as logic operators. This can provide massive speed and efficiency gains, especially for optimisation problems that need parallelization. Just consider that 4 bits of information on DNA take only about 1 nm^3 of volume, whereas solid-state memory stores about 3 Tb/in^2, which is roughly equivalent to 10^7 nm^3.

To me it is still not clear how scalable DNA computing is, but there are nice proofs of concept already [2].

[1] https://en.wikipedia.org/wiki/DNA_computing
[2] https://www.nature.com/articles/s41586-018-0289-6
QML, almost 7 years ago

Algorithmic game theory [1].

Going into CS as an undergrad, I didn't anticipate the depth that the field had in other domains, and for some time I wanted to double major in {math, biology, economics} to supplement my education.

However, in the algorithms course, I stumbled upon a connection between linear programming and 2-player zero-sum games (the minimax theorem [2]). Up to that point, I had never considered the idea of using a computational lens to view problems outside of CS, such as "what is the complexity of Nash equilibrium?"

It turns out algorithmic game theory can be applied to study the theory of auctions (Why does eBay use second-price auctions?) [3], tournament design (Why would a team purposely lose?) [4], or something as basic as routing (Why does building more roads lead to more congestion?).

[1] https://en.wikipedia.org/wiki/Algorithmic_game_theory

[2] https://en.wikipedia.org/wiki/Minimax_theorem

[3] https://en.wikipedia.org/wiki/Auction_theory

[4] https://theory.stanford.edu/~tim/f13/l/l1.pdf
dasmoth, almost 7 years ago

I don't know for sure, but I certainly *hope* we'll see some fresh thinking about user interface design and construction. The past couple of decades seem to have been substantially about recapitulating what came before in the web browser, and while webification has its good sides (easier deployment), the actual interfaces for data-entry-type tasks still seem as clunky as ever.

AR is potentially an interesting sub-field, but doesn't seem to be the answer for everything (e.g. those form-like data entry tools...)
deadalus, almost 7 years ago

Deep Video Portraits: https://web.stanford.edu/~zollhoef/papers/SG2018_DeepVideo/page.html

DeepFake creation tools: https://voat.co/v/DeepFake/2405562

Adobe Voco: https://en.wikipedia.org/wiki/Adobe_Voco
lldata, almost 7 years ago

CRDTs look interesting and are new enough that there will be more to learn about them. They seem like an important component in distributed systems (i.e. almost all new systems).

https://en.wikipedia.org/wiki/Conflict-free_replicated_data_type
Dowwie, almost 7 years ago

Computational law. See: http://logic.stanford.edu/complaw/complaw.html
chriswait, almost 7 years ago

One interesting approach might be to work backwards from desired practical applications: https://www.gartner.com/smarterwithgartner/gartner-top-10-strategic-technology-trends-for-2018/
randcraw, almost 7 years ago

Personally, I think higher-order cognition in AI will be hot. Deep learning has monetized the introduction of AI into numerous mainstream domains (e.g. smart NLP search, vision, speech, game RL), which will motivate and underwrite efforts to push AI beyond the shallow hacks of the past, finally breaking through AI's brittleness problem.

Some problem domains are killer apps, like self-driving cars, and, on smartphones, personal digital assistants and verbal interfaces. There's no stopping these initiatives. The only question is how far each can go without moon-shot levels of investment. But between the economic interests of Google and Apple especially in advancing their mobile devices, and the military's interest in making weapons and intel as smart as possible, I'm convinced there's enough critical mass for AI's pile to stay hot for a couple of decades or more.

The trick is to avoid the roadblocks that today's academic agenda inflicts on researchers by demanding they publish frequent shallow incremental novelties. What's needed is 5-10 years to develop the infrastructure required to field robust general reasoning with causation and rich knowledge bases.
ArtWomb, almost 7 years ago

Within a year or two, we will see grad-level courses at top CS programs in chaos engineering and SRE, just as we've seen the addition of distributed systems classes in the past few years with introductions to ZooKeeper, Paxos, BigTable, Raft, and Spanner. There will be an explosion in academic work on the science of "failure-injection" methods ;)
akqu, almost 7 years ago

There are currently huge opportunities in applied computing for people who can break out of the status quo. There has never been such a big gap between what technology can do and what technology culture actually does. Of course it isn't easy; it has also never been easier to waste time in technology.
jonbaer, almost 7 years ago

I think anything "emerging" will come from forms of unconventional computing [1]. ML/DL/AI are being rehashed on faster silicon hardware; I wouldn't call it hype, but it will be better applied to another form of hardware, once that's realized. I personally think reversible computing [2] (once understood) makes the most sense in terms of energy efficiency in CS (much needed).

[1] https://en.wikipedia.org/wiki/Unconventional_computing

[2] https://en.wikipedia.org/wiki/Reversible_computing
gnode, almost 7 years ago
Out-of-order execution and caches are once again emerging fields, unfortunately.
therealmarv, almost 7 years ago

Not so much science in this list; more a personal and practical view:

As a web developer: WebAssembly

As a DevOps engineer: Kubernetes

As a backend engineer: headless (CMS) API systems like Strapi or Wagtail
crawfordcomeaux, almost 7 years ago

Help me create a field of human programming that's informed by computer science? Below is a simplified description of some ideas informing what I do. These days I'm more focused on language and behaviors in myself and my primary relationship in preparation for our first child, so it'd be nice if someone else started working on the theoretical stuff. I'm also down for informally experimenting with things anyone comes up with from this.

Here's the basis:

Start with a category-theoretical model connecting neuroanatomy and thought (MENS: category theory and the hippocampus). Combine it with the concepts of universal embedding and fully abstract languages. Replace computers in the previous sentence with a computational model of a human; I'm playing with modified versions of differentiable neural computers and perceptual sets comprised of beliefs, emotions, intentions, and behavior/thought patterns. Choose a human language to mathematically hack into a strict subset of itself so it meets the requirements for a target language in universal embedding. I suspect some form of type theory might be needed for that; coeffects seem like they could be useful, as well as quantitative type theory. Use yourself as the primary experimental subject (i.e. the test machine) to help guide things and don't worry about reproducible results; trust that you're an ordinary human with essentially the same cognitive functions as everyone else, for now. Explore how this can impact human relationships. Discover ways to organize the self in such a way as to more effectively organize at scale.

Teach the world how to program itself at an individual level.
laser, almost 7 years ago

Only because I don't yet see it mentioned, the one emerging field to rule them all: program synthesis. :P
montalbano, almost 7 years ago

Zero-knowledge proofs.

Secure execution environments.
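A classic zero-knowledge-flavored protocol is Schnorr's proof of knowledge of a discrete logarithm: the prover convinces the verifier she knows x with y = g^x mod p, without revealing anything about x. A toy simulation with deliberately tiny, insecure parameters (real deployments use large prime-order groups):

```python
import random

# Toy Schnorr identification protocol -- insecure parameters,
# illustration only.
p, g = 23, 5          # 5 generates the full multiplicative group mod 23
n = p - 1             # order of the group (22)

x = 7                 # prover's secret
y = pow(g, x, p)      # public key, published in advance

def prove_and_verify():
    r = random.randrange(n)
    t = pow(g, r, p)               # 1. prover commits to a random r
    c = random.randrange(n)        # 2. verifier sends a random challenge
    s = (r + c * x) % n            # 3. prover responds; s alone leaks no x
    # 4. verifier checks g^s == t * y^c, i.e. g^(r+cx) == g^r * g^(xc)
    return pow(g, s, p) == (t * pow(y, c, p)) % p

assert all(prove_and_verify() for _ in range(100))
```

Intuitively, the random r masks x in the response, while the check only passes for someone who actually knows x (except with negligible probability).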
jpamata, almost 7 years ago

Graphical models [0] and probabilistic programming [1], with the latter making it easier for developers to dive into this growing AI trend. Research in the field over the past decade has been steadily booming, with more companies like Microsoft leading the way. I recommend checking out some MOOCs [2] on Coursera.

[0] http://www.computervisionblog.com/2015/04/deep-learning-vs-probabilistic.html

[1] http://probabilistic-programming.org/wiki/Home

[2] https://www.coursera.org/specializations/probabilistic-graphical-models
tabtab, almost 7 years ago

An informal HN survey on what some feel is the future vs. over-hyped: https://news.ycombinator.com/item?id=17129481
onion2k, almost 7 years ago
Functional correctness, formal verification and automated bug fixing.
vortico, almost 7 years ago

Since you mentioned movements in free software: "open core" has been emerging in the last four years as a viable way to do business while allowing other individuals and companies to build onto your core platform, while still being able to monetize your work. For example, if you launch a startup specializing in building foo, you can maintain a library called libfoo and sell a larger foo application, foo plugins, or foo services built on the open-source library you created.
deepaksurti, almost 7 years ago

Open-source computing hardware: RISC-V [0].

[0] https://riscv.org/risc-v-foundation/
KaiserPro, almost 7 years ago

Low-power sensors and their associated networks.

It's hard to do and has lots of real-world applications.
RantyDave, almost 7 years ago

I am *certain* there will be an emerging field in AI for engineering. Suspension that 'learns' how to keep the car flat; buildings that start shuffling warm air from A to B before it's needed; things like that. Programming is going to change from "explain how to do it" to "show it what you want", and this has got to be a big deal.
msbroadf, almost 7 years ago
Fully Homomorphic Encryption
otakucode, almost 7 years ago

Amorphous computing has always seemed interesting to me: computing with emergent phenomena among scatterings of large numbers of unreliable simple processors, like the sort of thing you could mix into paint. It's very young, with lots of fundamentals to be worked out, but that's what makes it interesting!
jfilter, almost 7 years ago

Wide-scale adoption and promotion of open-source software in industry (e.g. Microsoft, Facebook, Google).
simonhughes22, almost 7 years ago

Adversarial machine learning. Fake-news detection using ML. Integrating "good old-fashioned AI" ideas with modern ML techniques; to some extent AlphaGo went in this direction. While I am glad AI has moved far more towards the machine learning direction, I suspect the decades of AI research that preceded it may come back in a form combined with more modern techniques in some way. I see AlphaGo (and AlphaZero) as steps in that direction. Also, applying deep learning to search engines and making that scale efficiently. I suspect Google has partly solved this already but hasn't gone public with any of the tech; that's pure speculation.
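The core adversarial-ML trick — perturb an input along the sign of the loss gradient — fits in a few lines even without a deep-learning framework. A sketch of an FGSM-style attack on a fixed logistic-regression "model" (the weights, inputs, and epsilon are all invented for the illustration):

```python
import math

# A pretrained two-feature logistic-regression model (weights assumed).
w, b = [2.0, -1.5], 0.5

def predict(x):
    """Probability of class 1 under the logistic model."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))

def fgsm(x, y, eps):
    """Fast-gradient-sign attack: for cross-entropy loss the input
    gradient is (p - y) * w_i, so step each feature by eps in that
    gradient's sign to increase the loss."""
    p = predict(x)
    return [xi + eps * math.copysign(1.0, (p - y) * wi)
            for xi, wi in zip(x, w)]

x, y = [1.0, 0.2], 1
assert predict(x) > 0.5            # originally classified as class 1
x_adv = fgsm(x, y, eps=1.5)
assert predict(x_adv) < 0.5        # the perturbation flips the prediction
```

Against deep networks the same one-step attack works with surprisingly small epsilon, which is what makes the adversarial-robustness problem hard.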
mongol, almost 7 years ago
Quantum computing?
lazyjones, almost 7 years ago

Swarm computing. The hardware and networking to make it practical and useful exist now, but the field is still in its infancy. There's some discussion about its use in autonomous driving, construction, and warfare.
dalbasal, almost 7 years ago

Just as an angle on answering the question (I don't have answers of my own)...

What was the most interesting CS field (or fields) in 2008, '98, '88, etc.?
vikaskyadav, almost 7 years ago
Computer Vision.
doomjunky, almost 7 years ago

Functional programming!

Functional programming languages have several classic features that are now gradually being adopted by non-FP languages.

Lambda expressions [1] are one such feature, originating in FP languages such as Standard ML (1984) and Haskell (1990), and now implemented in C# 3.0 (2007), C++11 (2011), Java 8 (2014), and even JavaScript (ECMAScript 6, 2015).

Pattern matching [2] is another feature, now implemented in C# 7.0. My bet is that Java and others will follow in their next versions.

Here is a list of FP features, some of which are already adopted by non-FP languages: lambda expressions, higher-order functions, pattern matching, currying, list comprehensions, lazy evaluation, type classes, monads, no side effects, tail recursion, generalized algebraic datatypes, type polymorphism, higher-kinded types, first-class citizens, immutable variables.

[1] https://en.wikipedia.org/wiki/Lambda_calculus

[2] https://en.wikipedia.org/wiki/Pattern_matching
kitanata, almost 7 years ago

The use of lattice Boltzmann equations to parallelize computation. It's already being used in dynamic fluid simulations, but its applications are pretty endless. I wouldn't be surprised to see LBM translated for use in machine learning, cognitive AI, NLP, computational biology, etc.
amino, almost 7 years ago

Programmatic identification and comprehension of morality in software applications is a fascinating area!
k__, almost 7 years ago

I think the core disciplines are the same; sometimes just some "updates" happen, like AI or CV in the last few years.

The big changes are happening in engineering (software and hardware).

Many things that were known for decades are now accessible to a broader audience.
I_am_tiberius, almost 7 years ago
Quantum computing and cryptography
azhenley, almost 7 years ago

Human-computer interaction. The field is not new, but the way people interact with computers has drastically changed in the last ten years and will probably continue to do so.
al_ramich, almost 7 years ago

The hype is where the money is, and if you look at established emerging tech, AI and IoT are projected to get the most funding and create the most disruption in the years to come: https://uk.pcmag.com/feature/94662/blockchain-and-robots-buzzy-but-not-yet-vc-blockbusters

The cutting-edge emerging tech, I feel, will be in the way we engage with data and technology, and augmented (assistive) intelligence will see huge advancements.
tugberkk, almost 7 years ago

Internet of Things. It is still developing and there is lots of stuff to work on.
arisAlexis, almost 7 years ago

Directed acyclic graphs for blockchain use. Homomorphic cryptography.
jklein11, almost 7 years ago
Targeted Advertising
tastyham, almost 7 years ago
Quantum Computing
fl0tingh0st, almost 7 years ago
Computer Networks
wellboy, almost 7 years ago

Blockchain, though no one on HN really understands it. :D