
Toward a better programming

297 points | by ibdknox | about 11 years ago

49 comments

kens · about 11 years ago

I alternate between thinking that programming has improved tremendously in the past 30 years, and thinking that programming has gone nowhere.

On the positive side, things that were cutting-edge hard problems in the 80s are now homework assignments or fun side projects. For instance, write a ray tracer, a spreadsheet, Tetris, or an interactive GUI.

On the negative side, there seems to be a huge amount of stagnation in programming languages and environments. People are still typing the same Unix commands into 25x80 terminal windows. People are still using vi to edit programs as sequential lines of text in files using languages from the 80s (C++) or 90s (Java). If you look at programming the Eniac with patch cords, we're obviously a huge leap beyond that. But if you look at programming in Fortran, what we do now isn't much more advanced. You'd think that given the insane increases in hardware performance from Moore's law, programming should be a lot more advanced.

Thinking of Paul Graham's essay "What you can't say", if someone came from the future I expect they would find our current programming practices ridiculous. That essay focuses on things people don't say because of conformity and moral forces. But I think just as big an issue is things people don't say because they literally can't say them - the vocabulary and ideas don't exist. That's my problem - I can see something is very wrong with programming, but I don't know how to explain it.
freyrs3 · about 11 years ago

This strikes me as armchair philosophizing about the nature of programming language design. Programming languages are not intentionally complex in most cases; they're complex because the problems they solve are genuinely hard, not because we've artificially made them that way.

There is always a need for two types of languages: higher-level domain languages and general-purpose languages. Building general-purpose languages is a process of trying to build abstractions that always have a well-defined translation into something the machine understands. It's all about the cold hard facts of logic, hardware and constraints. Domain languages, on the other hand, do exactly what he describes, "a way of encoding thought such that the computer can help us", such as Excel or Matlab, etc. If you're free from the constraint of having to compile arbitrary programs to physical machines and can instead focus on translating a small set of programs to an abstract machine, then the way you approach the language design is entirely different and the problems you encounter are much different and often more shallow.

What I strongly disagree with is claiming that the complexities that plague general-purpose languages are somehow mitigated by building more domain-specific languages. Let's not forget that "programming" runs the whole gamut from embedded systems programming in assembly all the way to very high-level theorem proving in Coq, and understanding anything about the nature of that entire spectrum is difficult indeed.
RogerL · about 11 years ago

There's a reason the game Pictionary is hard, despite the "a picture is worth a thousand words" saying. And that is that images, while evocative, are not very precise. Try to draw how you feel.

If you are using card[0][12] to refer to Card::AceSpades, well, time to learn enums or named constants. If, on the other hand, the array can be sorted, shuffled, and so on, what value is it to show an image of a specific state in my code?

There's a reason we don't use symbolic representation of equations, and it has nothing to do with ASCII. It's because this is implemented on a processor that simulates a continuous value with a discrete value, which introduces all kinds of trade-offs. We have a live thread on that now: why is a*a*a*a*a*a not (a*a*a)*(a*a*a). I need to be able to represent exactly how the computation is done. If I don't care, there is Mathematica, and the like, to be sure.

If you disagree with me, please post your response in the form of an image. And then we will have a discussion about how powerful textual representation actually is. I'll use words, you use pictures. Be specific.
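The non-associativity referenced there is easy to see in any language with IEEE-754 floats. The linked thread concerned repeated multiplication; addition is the textbook case and shows the same effect. A quick Python sketch (not part of the original comment):

```python
# Floating-point arithmetic is not associative: regrouping changes
# which intermediate results get rounded, and therefore the answer.
left = (0.1 + 0.2) + 0.3
right = 0.1 + (0.2 + 0.3)

print(left == right)   # False: the two groupings round differently
print(left - right)    # a tiny nonzero residue
```

This is why a programmer sometimes needs to spell out the exact order of evaluation rather than hand a symbolic equation to the machine, which is the comment's point.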
j2kun · about 11 years ago

I'm concerned about Chris's desire to express mathematical formulas directly in an editing environment.

Coming from a mathematician with more than enough programming experience under his belt: programming is far more rigorous than mathematics. The reason nobody writes math in code is not because of ASCII, and it's not even because of the low-level hardware as someone else mentioned. It's because math is so jam-packed with overloaded operators and ad hoc notation that it would be an impossible feat to standardize any nontrivial subset of it. This is largely because mathematical notation is designed for compactness, so that mathematicians don't have to write down so much crap when trying to express their ideas. Your vision is about accessibility and transparency and focusing on problem solving. Making people pack and unpack mathematical notation to understand what their program is doing goes against all three of those!

So where is this coming from?

PS. I suppose you could do something like have layovers/mouseovers on the typeset math that give a description of the variables, or something like that, but still sum(L) / len(L) is so much simpler and more descriptive than \sigma x_i / n
mamcx · about 11 years ago

Natural languages (like English, Spanish) show why this kind of thinking leads nowhere, and why a programming language is more like English than like glyphs.

Something the post does not say: we want to make programs about *everything*. To make that possible, we need a way to express everything that could need to be communicated. Words and alphabets provide the best way.

In a natural language, when a culture discovers something (say, the internet) and words to describe internet-things don't yet exist, they "pop" from nowhere into existence. Written language has this ability in better ways than glyphs do.

In programming, if we need a way to express how to loop over things, then "FOR x IN Y" will "pop" from nowhere as the way to do it.

Words are more flexible. They are cheap to write, faster to communicate, and they cross boundaries.

Of course, having an editor helper so a HEX value can be shown as a color is neat - but what if a HEX value is NOT a color? Then you need a very strong type system, and I don't see how to build one better than with words.
zwieback · about 11 years ago

Interesting work and I really liked the LightTable video, but I think there's a reason these types of environments haven't taken off.

To understand why programming remains hard, it just takes a few minutes of working on a lower-level system, something that does a little I/O or has a couple of concurrent events, maybe an interrupt or two. I cannot envision a live system that would allow me to debug those systems very well, which is not to say current tools couldn't be improved upon.

One thing I've noticed working with embedded ARM systems is that we now have instruction and sometimes data trace debuggers that let us rewind the execution of a buggy program to some extent. The debugger workstations are an order of magnitude more powerful than the observed system, so we can do amazing things with our trace probes. However, high-level software would need debugging systems an order of magnitude more powerful than the client they debug as well.
jostylr · about 11 years ago

Both the indirect and incidentally complex can be helped with literate programming. We have been telling stories for thousands of years, and the idea of literate programming is to facilitate that. We do not just tell them in a linear order, but jump around in whatever way makes sense. It is about understanding the context of the code, which can be hard.

But the problem of being unobservable is harder. Literate programming might help in making chunks more accessible for understanding/replacing/toggling, but with live flow forwards-backwards, it would not. But I have recently coded up an event library that logs the flow of the program nicely. Used appropriately, it probably could be used to step in and out as well.

I am not convinced that radical new tools are needed. We just have to be true to our nature as storytellers.

I find it puzzling why he talks about events as being problems. They seem like ideal ways of handling disjointed states. Isn't that how we organize our own ways?

I also find it puzzling to promote Excel's model. I find it horrendous. People have done very complex things with it which are fragile and incomprehensible. With code, you can read it and figure it out; literate programming helps this tremendously. But with something like Excel or XCode's interface builder, the structure is obscured and is very fragile. Spreadsheets are great for data entry, but not for programming-type tasks.

I think creation is rather easy; it is maintenance that is hard. And for that, you need to understand the code.
chenglou · about 11 years ago

I have tremendous respect for people who dare to dream big despite all cynicism and common assumptions, and especially for people who have the skills to actually make the changes. Please keep doing the work you're doing.
Detrus · about 11 years ago

Toward a better computer UI

The Aurora demo did not look like a big improvement until maybe http://youtu.be/L6iUm_Cqx2s?t=7m54s where the TodoMVC demo beats even Polymer in LOC count and readability.

I've been thinking of similar new "programming" as the main computer UI, to ensure it's easy to use and the main UI people know. Forget Steve Jobs and XEROX, they threw out the baby with the bath water.

Using a computer is really calling some functions, typing some text input in between, calling some more.

Doing a few common tasks today is

    opening a web browser
    clicking Email
    reading some
    replying
    getting a reply back, possibly a notification
    clicking HN
    commenting on an article in a totally different UI than email
    going to threads tab manually to see any response

And the same yet annoyingly different UI deal on another forum, on youtube, facebook, etc. Just imagine what the least skilled computer users could do if you gave them a computing interface that didn't reflect the world of fiefdoms that creates it.

FaceTwitterEtsyRedditHN fiefdoms proliferate because of the separation between the XEROX GUI and calling a bunch of functions in Command Line. Siri and similar AI agents are the next step in simple UIs. What people really want to do is

    tell Dustin you don't agree with his assessment of Facebook's UI changes
    type/voice your disagreement
    share with public

And when you send Dustin and his circle of acquaintances a more private message, you

    type it
    share message with Dustin and his circle of designers/hackers

To figure out if more people agreed with you or Dustin

    sentiment analysis of comments about Dustin's article compared to mine

That should be the UI, more or less. Implement it however: natural language, Siri AI, a neat collection of functions.

Today's UI would involve going to a cute blog service because it has a proper visual template. This requires being one of the cool kids and knowing of this service. Then going to Google+ or email for the more private message. Then opening up an IDE or some text sentiment API and going through their whole other world of incantations.

Our glue/CRUD programming is a mess because using computers in general is a mess.
sold · about 11 years ago

The standard deviation is a poor example IMO; in many languages you can get much closer to mathematical notation.

    from math import sqrt

    def stddev(x):
        avg = sum(x) / len(x)
        return sqrt(sum((xi - avg)**2 for xi in x) / len(x))

    stddev xs = let avg = sum xs / fromIntegral (length xs)
                in sqrt $ sum [(x - avg)**2 | x <- xs] / fromIntegral (length xs)
qnaal · about 11 years ago

Hate to break it to you people, but rms was always right - the #1 reason why programming sucks is that everyone wants complete control over all of the bullshit they threw together and thought they could sell.

Imagine an environment like a lisp machine, where all the code you run is open and available for you to inspect and edit. Imagine a vast indexed, cross-referenced, and mass-moderated collection of algorithm implementations and code snippets for every kind of project that's ever been worked on, at your fingertips.

Discussing how we might want slightly better ways to write and view the code we have written is ignoring the elephant problem - that everything you write has probably been written cleaner and more efficiently several times before.

If you don't think that's fucked up, think about this: the only reason to lock down your code is an economic one, despite the fact that making all code freely usable would massively increase the total economic value of the software ecosystem.
crusso · about 11 years ago

I liked this article. I particularly liked the way the author attacked the problem by clearing his notions of what programming is and attempting to come at it from a new angle. I'll be interested to see what his group comes up with.

That said, I think that fundamentally the problem isn't with programming, it's with US. :) Human beings are imprecise, easily confused by complexity, unable to keep more than a couple of things in mind at a time, can't think well in dimensions beyond 3 (if that), unable to work easily with abstractions, etc. Yet we're giving instructions to computers which are (in their own way) many orders of magnitude better at those tasks.

Short of AI that's able to contextually understand what we're telling them to do, my intuition is that the situation is only going to improve incrementally.
bachback · about 11 years ago

Leibniz wrote in 1666: "We have spoken of the art of complication of the sciences, i.e., of inventive logic... But when the tables of categories of our art of complication have been formed, something greater will emerge. For let the first terms, of the combination of which all others consist, be designated by signs; these signs will be a kind of alphabet. It will be convenient for the signs to be as natural as possible - e.g., for one, a point; for numbers, points; for the relations of one entity with another, lines; for the variation of angles and of extremities in lines, kinds of relations. If these are correctly and ingeniously established, this universal writing will be as easy as it is common, and will be capable of being read without any dictionary; at the same time, a fundamental knowledge of all things will be obtained. The whole of such a writing will be made of geometrical figures, as it were, and of a kind of pictures - just as the ancient Egyptians did, and the Chinese do today. Their pictures, however, are not reduced to a fixed alphabet... with the result that a tremendous strain on the memory is necessary, which is the contrary of what we propose."

http://en.wikipedia.org/wiki/Characteristica_universalis
PaulAJ · about 11 years ago

The standard deviation example conflates two questions:

1: Why can't we use standard mathematical notation instead of strings of ASCII?

2: Why do we need lots of control flow and libraries when implementing a mathematical equation as an algorithm?

The first is simple: as others have pointed out here, math notation is too irregular and informal to make a programming language out of it.

The second is more important. In pretty much any programming language I can write:

    d = sqrt (b^2 - 4*a*c)
    x1 = (-b + d)/(2*a)
    x2 = (-b - d)/(2*a)

which is a term-by-term translation of the quadratic formula. But when I want to write this in C++ I need a loop to evaluate the sigma term.

But in Haskell I can write this:

    stDev :: [Double] -> Double
    stDev xs = sqrt ((1/(n-1)) * sum (map (\x -> (x-m)^2) xs))
      where n = fromIntegral $ length xs
            m = sum xs / n

This is a term-by-term translation of the formula, in the same way that the quadratic example was. Just as I use "sqrt" instead of the square root sign, I use "sum" instead of sigma and "map" with a lambda expression to capture the internal expression.

Experienced programmers will note that this is an inefficient implementation because it iterates over the list three times, which illustrates the other problem with using mathematics: the most efficient algorithm is often not the most elegant one to write down.
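The efficiency point above can be made concrete. A single-pass alternative (Welford's online algorithm) touches the data once and is numerically stable, but is clearly no longer a term-by-term translation of the formula. A sketch in Python, not from the comment itself:

```python
from math import sqrt

def stdev_one_pass(xs):
    """Welford's online algorithm: one traversal of the data.

    Computes the sample standard deviation (the 1/(n-1) form used in
    the comment above). Assumes at least two samples.
    """
    n = 0
    mean = 0.0
    m2 = 0.0  # running sum of squared deviations from the current mean
    for x in xs:
        n += 1
        delta = x - mean
        mean += delta / n          # update the running mean
        m2 += delta * (x - mean)   # update using old and new deviations
    return sqrt(m2 / (n - 1))
```

Compare its shape with the Haskell version: the math disappears into update steps, which is exactly the elegance-versus-efficiency trade-off the comment describes.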
phantomb · about 11 years ago

Historically it has been easy to claim that programming is merely incidentally complex, but hard to actually produce working techniques that can dispel the complexity.

The truth is that programming is one of the most complex human undertakings by nature, and many of the difficulties faced by programmers - such as the invisible and unvisualizable nature of software - are intractable.

There are still no silver bullets.

http://en.wikipedia.org/wiki/No_Silver_Bullet
http://faculty.salisbury.edu/~xswang/Research/Papers/SERelated/no-silver-bullet.pdf
dude42 · about 11 years ago

Sadly I feel that LT has jumped the shark at this point. What started off as a cool new take on code editors has now somehow turned into a grand view of how to "fix programming". I can get behind an editor not based around text files, or one that allows for easy extensibility. But I can't stand behind some project that tries to "fix everything".

As each new version of LT comes out I feel that it's suffering more and more from a clear lack of direction. And that makes me sad.
JoelOtter · about 11 years ago

Forgive me if my understanding is totally out of whack, but it seems here that the writer is calling for an additional layer of abstraction in programming - type systems being an example.

While in some cases that would be great, I'm not entirely sure more abstraction is what I want. Having a decent understanding of the different layers involved, from logic gates right up to high-level languages, has helped me tremendously as a programmer. For example, when writing in C, because I know some of the optimisations GCC makes, I know where to sacrifice efficiency for readability because the compiler will optimise it out anyway. I would worry that adding more abstraction will create more excuses not to delve into the inner workings, which wouldn't be to a programmer's benefit. Interested to hear thoughts on this!
michaelsbradley · about 11 years ago

Chris, have you read Prof. David Harel's[1] essay *Can Programming be Liberated, Period?*[2]

The sentiments expressed in the conclusion of Harel's article *Statecharts in the Making: A Personal Account*[3] really jumped out at me last year. When I read your blog post, I got the impression you are reaching related conclusions:

"If asked about the lessons to be learned from the statecharts story, I would definitely put tool support for executability and experience in real-world use at the top of the list. Too much computer science research on languages, methodologies, and semantics never finds its way into the real world, even in the long term, because these two issues do not get sufficient priority.

One of the most interesting aspects of this story is the fact that the work was not done in an academic tower, inventing something and trying to push it down the throats of real-world engineers. It was done by going into the lion's den, working with the people in industry. This is something I would not hesitate to recommend to young researchers; in order to affect the real world, one must go there and roll up one's sleeves. One secret is to try to get a handle on the thought processes of the engineers doing the real work and who will ultimately use these ideas and tools. In my case, they were the avionics engineers, and when I do biological modeling, they are biologists. If what you come up with does not jibe with how they think, they will not use it. It's that simple."

[1] http://www.wisdom.weizmann.ac.il/~harel/papers.html

[2] http://www.wisdom.weizmann.ac.il/~harel/papers/LiberatingProgramming.pdf

[3] http://www.wisdom.weizmann.ac.il/~harel/papers/Statecharts.History.CACM.pdf
SlyShy · about 11 years ago

Wolfram Language addresses a lot of these points. Equations and images both get treated symbolically, so we can manipulate them the same way we manipulate the rest of the "code" (data).
jonahx · about 11 years ago

I love seeing the challenges of programming analyzed from this high-level perspective, and I love Chris's vision.

I thought the `person.walk()` example, however, was misplaced. The whole point of encapsulation is to avoid thinking about internal details, so if you are criticizing encapsulation for hiding internal details you are saying that encapsulation *never* has any legitimate use.

I was left wondering if that was Chris's position, but convinced it couldn't be.
DanielBMarkham · about 11 years ago

I've been lucky to write at least one small application per year, although most of my work is now on the creative side: books, videos, web pages, and such.

So I find myself getting "cold" and then coming back into it. The thing about taking a week to set up a dev environment is spot on. It's completely insane that it should take a week of work just to sit down and write a for-next loop or change a button's text somewhere.

The problem with programming is simple: it's full of programmers. So every damn little thing they do, they generalize and then make into a library. Software providers keep making languages do more -- and become correspondingly more complex.

When I switched to Ocaml and F# a few years ago, I was astounded at how *little* I use most of the crap clogging up my programming system. I also found that while writing an app, I'd create a couple dozen functions. I'd use a couple dozen more from the stock libraries. And that was it. 30-40 symbols in my head and I was solving real-world problems making people happy.

Compare that to the mess you can get into just *getting started* in an environment like C++. Crazy stuff.

There's also a serious structural problem with OOP itself. Instead of hiding complexity and providing black-box components to clients, we're creating semi-opaque, non-intuitive messes of "wires". A lot of what I'm seeing people upset about in the industry, from TDD to stuff like this post, has its roots in OOP.

Having said all that and agreeing with the author, I'm a bit lost as to just what the heck he is ranting on about. I look forward to seeing more real tangible stuff -- I understand he's working on it. Best of luck.
jakejake · about 11 years ago

I liked the part of the article concerning "what is programming" and how we seemingly see ourselves as plumbers and glue makers - mashing together various parts and trying to get them to work.

I felt that the article takes a somewhat depressing view. Sure, these days we probably do all spend a lot of time getting two pieces of code written by others to work together. The article suggests there's no fun or creativity in that, but I find it plenty interesting. I see it as standing on the shoulders of giants, rather than just glumly fitting pipes together. It's the payoff of reusable code and modular systems. I happily use pre-made web servers, operating systems, network stacks, code libraries, etc. Even though it can be frustrating at times when things don't work, in the end my creations wouldn't even be possible without these things.
jeffbr13 · about 11 years ago

I love Chris Granger's work, and LightTable, but *jeeez* my eyes were going weird by the "Chasing Local Maxima" section.

Turn the contrast down!
arh68 · about 11 years ago

> *programming is our way of encoding thought such that the computer can help us with it.*

I really liked this. But I think we're encoding *work*, not *thought*.

If I could add to the list of hard problems: cache invalidation, naming things, *encoding things*.

I think the problem in a lot of cases is that the language came first, then the problem/domain familiarity comes later. When your language lines up with your problem, it's just a matter of *implementing the language*. Your algorithms then don't change over time, just the quality of that DSL's implementation.
3rd3 · about 11 years ago

I think this article forgot to emphasize the act of reading documentation, which probably takes 25% to 50% of programming time. Google and StackOverflow have already improved this greatly, but maybe there is still room for improvement. Maybe one could crowd-source code snippets in a huge Wikipedia-like repository for various languages. I'm imagining a context-sensitive auto-complete and search tool in which one can quickly browse this repository of code snippets, all prepared to adapt easily to existing variables and function names.
anaphor大约 11 年前
Just a few quotes from Alan Perlis:<p>There will always be things we wish to say in our programs that in all known languages can only be said poorly.<p>Re graphics: A picture is worth 10K words - but only those to describe the picture. Hardly any sets of 10K words can be adequately described with pictures.<p>Make no mistake about it: Computers process numbers - not symbols. We measure our understanding (and control) by the extent to which we can arithmetize an activity.
andrewl · about 11 years ago

Chris's criticisms of the current state of programming remind me of Alan Kay's quote: "Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves."

Thank you for all the work on Light Table, and I'm looking forward to seeing what the team does with Aurora.
zapov · about 11 years ago

As someone who is trying to improve the situation (https://dsl-platform.com) it's strange getting feedback from other developers. While we are obviously not very good at marketing, when you talk to other developers about programming done at a higher level of abstraction, the usual responses are:

- I'm not interested in your framework (even if it's not a framework)
- so you've built another ORM just like many before you (even if there is no ORM inside it)
- not interested in your language, I can get most of what I need writing comments in php (even if it's not remotely the same)

It takes a lot of time to transfer some of the ideas and benefits to the other side, and no, you can't do it in a one-minute pitch that the average developer can relate to.
agentultra · about 11 years ago

Visual representations are not terribly hard to come by in this day and age. It's almost trivial to write a little script that can visualize your tree data structures or relations. Plenty of good environments allow us to mingle all kinds of data.

I'm more interested in programs that understand programs and their run-time characteristics. It'd be nice to query a system that could predict regressions in key performance characteristics based on a proposed change (something like a constraint propagation solver on a data-flow graph of continuous domains), even in the face of ambiguous type information. Something like a nest of intelligent agents that can handle the complexity of implementation issues in concert with a human operator. We have a lot of these tools now, but they're still so primitive.
Locke1689 · about 11 years ago

The author is correct that programming currently under-addresses a specific set of use cases: solving problems with conceptually simple models in equally simple ways; in other words, "keep simple programs simple."

However, thinking about computation as only simple programs minimizes the opportunities in the opposite domain: using computation to supplement the inherently fragile and limited modeling that human brains can perform.

While presenting simplicity and understanding can help very much in realizing a simple mental model as a program, it won't help if the program being written is fundamentally beyond the capability of a human brain to model.

The overall approach is very valuable. Tooling can greatly assist both goals, but the tooling one chooses in each domain will vary greatly.
sdgsdgsdg大约 11 年前
Programming is taking the patterns which make up a thought and approximate them in the patterns which can be expressed in a programming language. Sometimes the thoughts we have are not easily expressed in the patterns of the computer language which we write in. What is needed is a computer language which pulls the patterns from our thoughts and allows them to be used within the computer language. In other words we need to automatically determine the correct language in which to express the particular problem a user is trying to solve. This is AI, we need compression - modularisation of phase space through time. The only way to bring about the paradigm shift he is describing in any real sense is to apply machine learning to programming.
analyst74 · about 11 years ago

I am optimistic about our field.

Things have not stayed stale for the past 20-30 years; in fact, the state of programming has not stayed stale even in the last 10 years.

We've been progressively solving the problems we face, inventing tools, languages, and frameworks to make our lives easier. That in turn allows us to solve more complicated problems, or similar problems faster.

Problems we face now, like concurrency, big data, and the lack of cheap programmers to solve business problems, were not even problems before. They are now, because they are possible now.

Once we solve the problems of today, we will face new problems. I don't know what they will be, but I am certain many of them would be problems we consider impractical or even impossible today.
评论 #7489726 未加载
评论 #7490278 未加载
programminggeek大约 11 年前
You want better programming? Get better requirements and less complexity. Programming languages and IDEs are part of the problem, but a lot of the problems come from the actual program requirements.<p>In many cases, it&#x27;s the edge cases and feature creep that make software genuinely terrible, and by the time you layer in all that knowledge, it is a mess.<p>I don&#x27;t care if you use Vim, Emacs, Visual Studio, or even some fancy graphical programming system. Complexity is complexity, and managing and implementing that complexity is a complex thing.<p>Until we have tools to better manage complexity, we will have messes, and the best tools for managing complexity are communication related, not software related.
lstroud大约 11 年前
This seems reminiscent of the &quot;wolfram language&quot; stuff a couple of weeks ago. Perhaps it&#x27;s a trend, but I can&#x27;t shake the feeling like I am seeing a rehash of the 4GL fiasco of the 90s.<p>I have a lot of respect for Chris. So, I hope I am wrong.
3rd3大约 11 年前
I think a lot could be won by reducing the complexity of our systems. In modern operating systems we stack too many abstraction layers on top of each other. Emacs is a great example of a development environment that avoids a lot of complexity because everything is written in one language (Emacs Lisp), functions are available throughout the system, one can rewrite functions at runtime, and one can easily pinpoint the source code of any function with the find-function command. It would actually be great to have an operating system that is as simple, extensible and flexible.
dmoney大约 11 年前
What I&#x27;d like for programming is a universal translator. Somebody writes a program in Java or Lisp, and I can read and modify it in Python and the author can read my changes in their own pet language. I write an Ant script and you can consume it with rubygems. You give me a program compiled into machine language or Java or .NET bytecode and I can read it in Python and run my modified version in the JVM, CLR, Mac, iPhone, Android, browser. Transparently, only looking at what the source language was if I get curious.
NAFV_P大约 11 年前
&gt; <i>Writing a program is an error-prone exercise in translation. Even math, from which our programming languages are born, has to be translated into something like this:</i><p>The article then compares some verbose C++ with a mathematical equation. That is hardly a fair comparison, the C++ code can be written and read by a human in a text editor, right click the equation &gt; inspect element ... it&#x27;s a gif. I loaded the gif into a text editor, it&#x27;s hardcore gibberish.<p>Personally, I would stick with the verbose C++.
datawander大约 11 年前
I wholly agree with this article. The exact point the author is getting at is something that I have been trying to say, but rather inarticulately (probably because I didn&#x27;t actually go out and survey people and define &quot;what is programming and what is wrong with it&quot;).<p>I really can&#x27;t wait for programming to be more than just if statements, thinking about code as a grouping of ASCII files, and gluing libraries together. Things like Akka are nice steps in that direction.
mc_hammer大约 11 年前
I have to disagree somewhat. IMHO the difference is in abstraction. I think good forms of abstraction have allowed computing to proceed as far as it has, and will allow it to proceed further.<p>I think abstraction may correlate with an IDE&#x27;s or library&#x27;s usefulness, popularity, and development time, more so than what your video demonstrates.<p>I have a question: how many clicks would it take to get this snippet from above to work?<p>Would you also have to navigate various dropdown menus? (Dropdowns are pretty terrible UI, and I would think reading different dropdown lists I&#x27;m not familiar with would be jarring.) IMHO it would be like writing software with two mouse buttons and dropdowns or other visual elements instead of with a keyboard, and would actually be slower; the opposite of my point above.<p><pre><code>#include &lt;cmath&gt;
#include &lt;valarray&gt;
#include &lt;iostream&gt;

double standard_dev(const std::valarray&lt;double&gt; &amp;vals) {
    return std::sqrt(std::pow(vals - (vals.sum() &#x2F; vals.size()), 2.0).sum() &#x2F; vals.size());
}

int main() {
    std::cout &lt;&lt; standard_dev({2, 4, 4, 4, 5, 5, 7, 8}) &lt;&lt; &#x27;\n&#x27;;
}</code></pre>
e12e大约 11 年前
I&#x27;m wondering, did the author ever play with Smalltalk&#x2F;Self? Essentially those environments let you interact with objects directly, insofar as that makes sense. Seems a good fit for the &quot;card game&quot; complaint.<p>Doesn&#x27;t help with the mathematical notation, though (although it would be possible to do something about that, I suppose).
DennisP大约 11 年前
I hope the production release will be editable by keyboard alone, instead of needing the mouse for every little thing.
评论 #7489945 未加载
AdrianRossouw大约 11 年前
man. i&#x27;ve been thinking about this stuff a lot.<p>especially after I saw rich hickey&#x27;s presentation &quot;simple made easy&quot; (my notes on it [1]).<p>I&#x27;m actually on a mission now to find ways to do things that are more straight forward. One of my finds is [2] &#x27;microservices&#x27;, which I think will resonate with how I perceive software these days.<p>[1] <a href="http://daemon.co.za/2014/03/simple-and-easy-vocabulary-to-describe-software-complexity" rel="nofollow">http:&#x2F;&#x2F;daemon.co.za&#x2F;2014&#x2F;03&#x2F;simple-and-easy-vocabulary-to-de...</a> [2] <a href="http://martinfowler.com/articles/microservices.html" rel="nofollow">http:&#x2F;&#x2F;martinfowler.com&#x2F;articles&#x2F;microservices.html</a>
clavalle大约 11 年前
I&#x27;m intrigued.<p>This is a problem that many, many very smart people have spent careers on. Putting out a teaser post is brave, and I have to believe you know what you are doing.<p>I am looking forward to the first taste. Do you have an ETA?
ilaksh大约 11 年前
I have been saying stuff like this for years, although not as eloquently or detailed. But now Chris Granger is saying it, and no one can say he&#x27;s not a &quot;real&quot; programmer, so you have to listen.<p>I think it boils down to a cultural failure, like the article mentions at the end. For example, I am a programmer myself. Which means that I generate and work with lots of static, cryptic colorful ASCII text program sources. If I stop doing that, I&#x27;m not a programmer anymore. By definition. I really think that is the definition of programming, and that is the big issue.<p>I wonder if the current version of Aurora derives any inspiration from &quot;intentional programming&quot;?<p>Also wonder when we can see a demo of the new version.
评论 #7490425 未加载
leishulang大约 11 年前
Sounds so philosophical ... almost sounds like something to do with how to get strong A.I and expecting some sort of universal answer ... such as 42.
hibikir大约 11 年前
There are entire families of problems that would be better solved with a far more visual approach to code. For instance, worrydream has some UX concepts on learnable programming that just feel much better than what we use today.<p>We could do similar things to visualize actor systems, handle database manipulation and the like. The problem is that all we are really doing is asking for visualization aids that are only good at small things, and we have to build them, one at a time. Without general purpose visualizations, we need toolsets to build visualizations, which needs more tools. It&#x27;s tools all the way down.<p>You can build tools for a narrow niche, just like the Lispers build their DSLs for each individual problem. But even in a world without a sea of silly parentheses and a syntax that is built for compilers, not humans, under every single line of easy, readable, domain-centric code lies library code that is 100% incidental complexity, and we can&#x27;t get rid of it.<p>Languages are hard. Writing code that attempts to be its own language is harder still. But those facts are not really the problem: they are a symptom. The real problem is that we are not equipped to deal with the detail we need to do our jobs.<p>Let&#x27;s take, for instance, our carefree friends who want to build contracts on top of Bitcoin by making them executable. I am sure a whole lot of people here realize their folly: no problem that is really worth putting into a contract is well defined enough to turn into code. We work with a level of ambiguity that our computers can&#x27;t deal with. So what we are doing, building libraries on top of libraries, each a bit better, is about as good a job as we can do.<p>I do see how, for very specific domains, we can find highly reusable, visual, high-level abstractions. 
But the effort required to build that, with the best tools out there, just doesn&#x27;t make any practical sense for a very narrow domain: we can build it, but there is no ROI.<p>I think the best we can do today is, instead of concentrating so much on how shiny each new tool is, to go back to the real basics of what makes a program work. The same things that made old C programs readable work just as well in Scala, but without half the boilerplate. We just have to forget about how exciting the new toys can be, or how smart they can make us feel, and evaluate them just on the basis of how they can really help us solve problems faster. Applying proper technique, like having code that has a narrative and consistent abstraction levels, will help us build tools faster, and therefore make it cheaper to, eventually, allow for more useful general purpose visualization plugins.
sgy大约 11 年前
<a href="http://www.paulgraham.com/progbot.html" rel="nofollow">http:&#x2F;&#x2F;www.paulgraham.com&#x2F;progbot.html</a>
GnarfGnarf大约 11 年前
Chris Granger sure doesn&#x27;t make it easy to contact him.
aoakenfo大约 11 年前
Bret Victor&#x27;s &quot;Inventing on Principle&quot; demonstrates an immediate connection with the tool: <a href="http://vimeo.com/36579366" rel="nofollow">http:&#x2F;&#x2F;vimeo.com&#x2F;36579366</a>