From BlenderWiki

Proposal to Remove Features before 2.6

Here's a list of features that I'd like to disable or remove before the release, and a few defaults I'd like to change. I'm restraining myself when it comes to things related to the particle or material system, since those are likely to undergo bigger changes later on anyway.

Removing features helps make the code easier to maintain, makes the UI cleaner, and allows us to close a few bugs. Before commenting, I would like you to ask yourself:

  • Would this feature be a good thing to add if it wasn't there yet?
  • Is the feature actually usable/used for use case X, or are you opposed to removing it because you think you might be able to use it someday but have not verified that it actually does what you want?

There is a second list with more controversial things that I would like to remove. I don't expect much support for these, but if the decision were mine alone I would remove them. It is added to get an idea of what other people think; don't worry, I'm not going to remove them unless there is enough support.

-- Brecht 17:13, 8 July 2010 (UTC)

Since these changes are not in the new 2.53 Beta, I have to ask (because I need to know for my book about 2.5): will any of these changes affect the stable 2.5 version, or will they only make it into 2.6?
--Hoshi 19:01, 22 July 2010 (UTC)

General Agreement

  • Vertex normal flip: this does more harm than good in my experience, and actually makes rim lighting look wrong. Already removed in the render branch, I'd like to port that over to trunk. Note that this is not the Flip Normal tool in mesh edit mode, it is a rendering option.
  • Instances option for raytracing: we can just enable this by default, just like the rest of the rendering system does automatically.
  • Edge Rendering: make this a compositing node.
  • B-Bone Rest: at least remove from the UI, and perhaps even the code. Was a compatibility option to keep broken behavior.
  • Pin Floating Panels: there aren't any, so option should not be there.
  • Snap always for Translate/Rotate/Scale: seems not necessary, just enable snap in the 3d view header if you always want to snap.
  • Outline Selected, All Object Origins and Relationship Lines in the 3d view should become user preferences.
  • Relative paths user preference and Remap Relative option for the save operator: can we just enable these two by default? Combined, they should make relative paths work quite well automatically, or are there cases that this doesn't cover?
  • File browser should hide hidden files and filter types by default. Every file browser does this, and it just seems to be what you want nearly always anyway. Especially on mac/unix hidden files are really in the way in the home directory.
  • Optimal display for multires and subsurf enabled by default.
  • All edges option for meshes: useful for dense meshes, but only if they are not too dense, because then they completely disappear. Just always enable it.
  • Particle jittered Particle/Face option and Jittering amount options: workarounds for poor distribution behavior.
  • Grid mesh primitive: can use a plane instead.
--ScaroDj 15:17, 10 July 2010 (GMT-6)
Ditch the plane and name the grid "Plane". What everybody does is add a plane and then subdivide or multi-loop-cut it. It's very useful to have a grid already made; the problem is that people are lazy and don't look down the list.

Sure, this one can be removed

  • Classical shadow buffer: change default to Classical-Halfway.
  • Some Time Offset features: the time offset value itself is still useful, but a few other options aren't:
    • Particle: hasn't done anything for years.
    • Edit: no longer does anything in 2.5, at least remove from UI or fix.
    • Parent: also marked as deprecated in 2.5, remove from UI or fix.
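The Relative Paths / Remap Relative point above boils down to expressing stored asset paths against the .blend file's own directory (Blender marks such paths with a leading `//`). A minimal sketch in plain Python of what that remapping amounts to; this is illustrative only, not Blender's actual implementation, and the function name is made up:

```python
import os.path

def make_blend_relative(asset_path, blend_dir):
    """Illustrative sketch: express an absolute asset path relative to the
    directory containing the .blend file, using Blender's '//' prefix
    convention for blend-relative paths. Not Blender's actual code."""
    rel = os.path.relpath(asset_path, blend_dir)
    # Blender stores blend-relative paths with forward slashes and a '//' prefix.
    return '//' + rel.replace(os.sep, '/')
```

With both options on by default, a "Save As" to a new directory would simply re-run this kind of remapping so textures keep resolving.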

No Agreement Yet

  • Console window on Windows: this is not used on Mac/Linux and should only contain messages for developers. If e.g. Python developers want to see printed messages, they can run Blender from the console.
--hoshi 22:03, 9 July 2010 (UTC)
Oh yes, please. I never really liked this one. Wouldn't this be better placed somewhere in the console editor? Or at least as an optional startup parameter?
--Keith 20:48, 9 July 2010 (UTC)
No. The console window should stay as it is, but its operation should be changed to easily display or hide the console window as desired or needed. It needs to be a dynamic, easily accessible control. Default: off. When the info is desired, it is a very useful tool. In Windows, I have gone through great pains to create the functionality to pipe data to a (normally non-existent) console window.
--hoshi 11:12, 11 July 2010 (UTC)
Then how about just showing it when started from CMD? A simple .bat file with "blender.exe" in it would show that window then, right?
The Yafaray community makes good use of the console window. The most important use is gathering debug information when someone posts a crash or error on the forums. If the standard Windows user didn't have access to the console, it would be much more of a process to get that information from them. The console window also shows progress information during the render. I believe all of this is true for the new 2.5/6 exporter as well.
--ScaroDj 13:45, 10 July 2010 (GMT-6)
I second Hoshi's motion. I think all of Keith's goals can be achieved by making it an editor; this is the one thing that breaks Blender's non-modality (along with the new User Preferences editor, if you open it from the File menu).
--Blendenzo 16:12, 17 July 2010 (UTC)
Why not make it an option in the user preferences menu and allow people to turn it off if they want to? This makes the most sense to me, since it allows for both users who utilize the console window (Python devs and BGE users) and those who find it an annoyance. I'm often debugging game scripts, and I find the extra step of launching from a terminal emulator on Linux to be counter-productive. IMO, taking away the console window on Windows would be taking away a convenience feature.
--Hoshi 08:22, 23 July 2010 (UTC)
I think it would be better to have it turned off at first and let it be turned on in the system settings. Whoever wants to use the program and doesn't need to debug it (which is probably most users) should not see things that confuse (or even annoy) them. Whoever needs to debug is well aware of how to turn system settings on and off; first-time users are not.
--Frr 10:40, 29 July 2010 (UTC)
The console window is plain bad UI design. It makes task switching harder, confuses users with superfluous information, and breaks Blender's whole UI paradigm. It also terminates Blender without warning if it is closed. Because it's a part of Windows and not a part of Blender, Blender developers can't change its functionality. Debug output should be written to a space within Blender instead. And to a log file as well, in case of crashes. Problem solved.
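Hoshi's .bat suggestion above could look something like the sketch below. It assumes `blender.exe` is on the PATH or in the same folder as the batch file; the file name is arbitrary. Because the process is started from an existing console, its stdout/stderr appear in that window, so no separate console window has to pop up by default:

```shell
@echo off
rem blender-console.bat -- sketch of the suggestion above: start Blender
rem from an existing console so its output appears in this window.
rem %* forwards any command-line arguments on to Blender.
blender.exe %*
```

This is a user-side workaround, not a proposed change to Blender itself.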
  • Texture and sequencer plugins: obviously plugins are useful, but it's possible to make a much better system that fits in the 2.5 design. I don't see the point of bringing them back in the current form, and they were not much used or properly supported in 2.4x anyway. We can just remove all the code, in my opinion.
--Jansende 14:03, 9 July 2010 (UTC)
I like the Sequencer. It makes Blender an easy-to-use video editor with lots of functions. I use it often to edit my videos. But I agree it's quite unhandy in some ways. So you may kick it out of the new release and write a better one. :)
--Brecht 17:43, 9 July 2010 (UTC)
First, I'm not proposing to remove the sequencer, only the plugin system. Note plugins are currently not functional in 2.5. I don't have a timeframe for when a new plugin system would be added. But I also think that bringing back the old system temporarily would be a waste of time, both for developers and for users who soon after will have to switch to another system anyway.
--Jansende 21:06, 9 July 2010 (UTC)
Ahhh. You mean things like the import/export plugins? Yeah, they are really creepy and unhandy. They should definitely be replaced. Most annoying is the sound handling. :( If you don't convert right, you get crap in Blender. :((
  • World stars: for '80s space animations this may be useful, but if you want something that actually looks good, you should use a texture.
--Bartrobinson 13:38, 9 July 2010 (UTC)
Keep stars. I don't know about 80's space animations, but I use this for visualization of 2010 space animation.
--Brecht 17:43, 9 July 2010 (UTC)
I think this can be replaced quite well with a particle system, with the added advantage that the look is actually configurable. Should we have a moon rendering system, a nebula system, etc? Those things are useful but best left to addons.
--Zeauro 11:07, 10 July 2010 (UTC)
It is not as easy as it seems to replace that with a particle system. If the mesh emitter is outside the camera clipping range, a system with no physics and volume emission is not rendered.
--ScaroDj 14:07, 10 July 2010 (GMT-6) (Sorry, idk how to do this)
I always thought that it was ridiculous to begin with, I would like a procedural star texture in the Texture Panel, though.
--hoshi 11:12, 11 July 2010 (UTC)
A procedural star texture would not be really 3D the way the stars are right now. I use them with defocus and it looks really nice; this wouldn't work with a flat procedural. If this were replaced by a particle system, all old blend projects using this feature would be missing the stars, right? Or would Blender then replace the stars with a particle system automatically?
--Zazizizou 15:20, 11 July 2010 (UTC)
Maybe the stars are not much used by professionals, but they are very important for beginners and intermediate users. I think this depends mostly on the user's skills, but the stars make the use of Blender generally easier. I'd love to keep this feature.


  • Texture mesh: is anyone actually using this? It's useful in principle, but seems like one of those things that was added because it was useful in a NeoGeo project but not used afterwards, and is now just a confusing option in the UI.
--Brecht 17:43, 9 July 2010 (UTC)
Alternatively this could be moved to the texture mapping panel, at least then it becomes clear what it is used for.
  • Object Color: for games this is useful, but otherwise I think it's a bad feature, so I propose to only show it in game engine mode. Note that this is not a wire frame color or solid view drawing color, it is a texface/material option.
--Jansende 21:25, 9 July 2010 (UTC)
I found it, and I wondered: what does this option do? I couldn't find out. Can you please explain it?
... I think if you can't tell what changes when you use an option, that option might be very useless.
  • Sticky texture coordinates: I doubt this is much used in practice. Corrected perspective interpolation, however, is useful and could be added back as an option for UVs.
--Pixelminer 21:11, 9 July 2010 (UTC)
If sticky texture coordinates are removed then there should still be some way to use a camera to project UVs onto surfaces. If not, I say keep it in since sticky textures are useful for projected matte paintings and such.
  • Animatable object layers: not compatible with the dependency graph. Visibility/renderability can be animated instead.
--Zeauro 12:30, 11 July 2010 (UTC)
I think that you should not reduce the use of animatable object layers to making things appear/disappear. It could be good to use this in compositing, for example, to pass objects from a Freestyle render layer to a photorealistic render layer.
  • Curve parent animation: replaced by constraint.
--Hoshi 11:20, 11 July 2010 (UTC)
I think I agree. I always wondered why there were two different ways to do the same thing. As long as there is full control over the start, end, and duration of the followed path (and so on), that is.
  • Curves in the image editor: replace by exposure slider. If you really need this much control, use compositing nodes.
  • Irregular Shadow Buffer: raytracing can be used instead.
--calli 14:19, 9 July 2010 (UTC)
Speed?
--Brecht 17:34, 9 July 2010 (UTC)
I think the main advantage is that you can disable raytracing, besides that I don't think there is a big speed difference.
--Otomo 13:12, 22 July 2010 (UTC)
This is how I managed to finish some renders/videos: I used these shadows with raytracing disabled, plus some compositing magic (SSAO at the time, among other things) and faked reflections, and I could finish in time and with acceptable quality. I think this versatility could be necessary in some work. Also, though I may be wrong about this, I think that even with raytracing turned on it's a cheap and fast way to get shadows.
  • Particle Line rendering: doesn't seem like you can create good looking results with this.
--Alexsani 17:13, 9 July 2010 (UTC)
I have used this for sparks renders, and it is not very high quality. If you could make a path-type version which would have a strand render for the path traveled by the particle, that would help greatly.
--Zeauro 16:18, 10 July 2010 (UTC)
Other interesting things than realistic sparks can be done with line rendering. [1]
  • Sequencer Glow effect: seems out of place.
--ScaroDj 15:07, 10 July 2010 (GMT-6)
If you could process the sequences through the Node Editor, it would make Blender a really powerful video editing tool, I think. Making a new node in the compositor (Add → Input → Sequence Layer) so that the only effects in the Sequencer would be the transition ones. So, yes, it is out of place.
--B3gin 10:49, 11 July 2010 (-5 UTC)
I agree with ScaroDj. The node system, with the ability to add the same effect to all/most/some of an animation, would be most useful. But the Glow effect should be kept for now because it is as close as we have (and the glow effect is very useful but time-consuming in nodes).
  • Slow Parent: useful, but doesn't actually work correctly with the current animation system. We could still keep it working in the game engine.
--calli 14:19, 9 July 2010 (UTC)
Many camera controls in the BGE rely on Slow Parent, so at least keep it for the BGE.
--cog 14:39, 9 July 2010 (UTC)
This list is beginning to remind me of an old adage about a bus company whose buses stopped picking up passengers.
When asked why they were doing this, the manager replied: "If we had to stop for passengers, we would never be able to keep up with the timetable."
Meaning: know what your business is, not what makes the job of running it easier.
--Blendenzo 16:12, 17 July 2010 (UTC)
I agree with keeping slow parent functional in the game engine. If it is not working properly in the blender animation system, then it could be made visible only when the Blender Game render engine is selected.
  • Invert Z Depth material option: a weird rasterization trick that doesn't seem very useful. Yes, you can do certain Escher-type renderings with it, but do we really need native support for that kind of thing?
  • Fields rendering: is anyone actually using Blender to produce content for TV with this option? For example, it doesn't work with compositing and never has. Can't we leave this to specialized tools?
--Bartrobinson 13:38, 9 July 2010 (UTC)
Keep field rendering. Very important! In order to directly support television standards, rendering of fields is required. If it's not supported, four times as much data needs to be rendered as is required with field rendering to get the same smooth motion quality. I've used it on several TV and DVD projects, commercials, and independent films. It would be nice if it were actually better supported.
--calli 14:35, 9 July 2010 (UTC)
Keep! There are still many DV cams/recorders in use, and not everybody has an LCD screen as a TV. 50i images are much smoother on such devices.
--Brecht 17:53, 9 July 2010 (UTC)
My point is not that interlaced video is outdated, but that the way it is implemented in Blender makes it of limited use. A better alternative is to render at double the framerate and encode it as interlaced at the last point in the pipeline. Besides that, Blender has never been able to save an interlaced video file that can be played correctly in a video player.
--Bartrobinson 15:43, 15 July 2010 (UTC)
Good, because it's not outdated. It's still very much a part of SD and HD TV specs. Rendering at double the frame rate both increases planning on the animator's part and also DOUBLES, if not QUADRUPLES (twice the frames and twice the vertical resolution), the render time on the renderer's part. Then it takes time to perform a final pass for interlacing in some other post program. Why would I want to add more than two to four times the work and time required to finish a shot, when there is a feature that already saves so much of that time? In projects where time is critical, we never render to VIDEO files anyway; we render frames on a render farm in case of a crash. It's still a critical time-saving feature in a lot of cases.
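Brecht's alternative above (render at double the frame rate, interlace at the last point in the pipeline) amounts to weaving scanlines from two successive progressive frames into one interlaced frame. A minimal sketch, not Blender code, assuming top-field-first order (field order varies by standard) and frames represented as lists of scanlines:

```python
def weave_fields(frame_a, frame_b):
    """Weave two successive progressive frames into one interlaced frame:
    even-numbered scanlines come from frame_a, odd-numbered scanlines
    from frame_b (top-field-first assumed). Frames are lists of scanlines."""
    assert len(frame_a) == len(frame_b), "frames must have equal height"
    return [frame_a[y] if y % 2 == 0 else frame_b[y]
            for y in range(len(frame_a))]
```

This step would then live at the encoding stage rather than in the renderer, which is the crux of the disagreement above: it halves nothing for the renderer, it only moves the weave out of Blender.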
  • Fields option for images: same reasoning as above. Also no one noticed this has been broken for years.
Field Rendering is very important for me and I really think that it's time for field rendering to be supported with the compositor.
--Bartrobinson 13:38, 9 July 2010 (UTC)
Repair the Fields option for images, and enhance it. If this were working, it would be a huge time saver for loading interlaced (professional) video for texturing and comping.
--speaker_mute 18:12, 9 July 2010
In the case of snapping, the translation snap is pretty useful for attached verts, but I think the rotation and scaling snaps should disappear.

No Suitable Replacement

These won't be removed since there is nothing good enough to replace them.

  • Environment Map texture: raytracing can be used instead.
--Bartrobinson 13:50, 9 July 2010 (UTC)
Ya, except when raytracing is too slow or you need to make a FAKE reflection. Keep Environment Map Textures.
--Papalagi 13:50, 9 July 2010 (UTC)
Env Map texturing is fundamental, both for rendering speed and special effects. We should have it.
--cog 14:32, 9 July 2010 (UTC)
Raytrace Mirror is currently far too slow for many shots. We don't all have access to super-fast 8GB RAM supercomputers, and while environment maps won't solve all mirror reflection shots, they are incredibly easy, quick to render, and painless for creating what you want.
--ScaroDj 15:17, 10 July 2010 (GMT-6)
Too useful for the BGE and backwards compatibility. For example in the back window of a racing car.
Very useful for creating angular maps for backgrounds (if you convert the environment map: [2]). Maybe this should be implemented directly; no need for an extra camera type. (/me dreaming of animated angular maps) -- 79.213.218.129 03:35, 11 July 2010 (UTC)
  • Blend Sky (no horizon/zenith, just one color): can use a texture instead; if you want a more interesting background, a gradient is not that useful anyway.
--Bartrobinson 13:47, 9 July 2010 (UTC)
Keep Blend Sky. Great for quick and dirty renders, which are just as important as pretty and more elaborate renders.
--Jansende 14:11, 9 July 2010 (UTC)
What if I want to have a blend sky? I don't want to create an extra texture for this!
--ScaroDj 15:17, 10 July 2010 (GMT-6)
This one sounds to me like removing the cube primitive. Definitely keep it.
--cog 14:22, 9 July 2010 (UTC)
Remember, the background is not only used for final renders; it can be very useful when creating materials, as you can quickly generate the color contrast a material might be used in. Having to go to another part of Blender to create another texture for such purposes really is counterproductive. I know you said this was a controversial list, but unfortunately the coder doesn't always know best. Rant over. ;)
  • Fresnel diffuse shader: not really a diffuse shader.
--yuutsumi 14:28, 9 July 2010 (CDT)
I use Fresnel quite a bit on objects in the BGE when the details of shading are in the texture maps rather than the object mesh. I know I can get the same look by adjusting the 'Toon' texture. But for stationary objects with a high quality texture providing the details and ambience, Fresnel seems to reduce some overhead since its a 'constant color' shader. I'd feel better about losing it if you added a "shadeless" checkbox functionality to the Diffuse Material Properties.